Search Results: 1 - 10 of 50339 matches for " Megan Yuan Sun "
All listed articles are free for downloading (OA Articles)
Stock Price Behavior around Extreme Trading Volumes
Megan Yuan Sun
Accounting and Finance Research , 2012, DOI: 10.5430/afr.v2n1p61
Abstract: The study investigates how the coexistence of extreme returns and volumes predicts future stock returns and how the high-volume return premium is affected by the coexistent extreme returns. It also examines the patterns of returns, volatility, and skewness around extreme trading volumes. We find that stocks exhibit different return and volatility patterns prior to and after extreme volumes. We also find that the high-volume return premium only exists among small size stocks which simultaneously experience extremely low prior returns. The high-volume return premium disappears for larger size stocks experiencing extremely low prior returns. Regardless of the firm size, the high-volume return premium only lasts for a very short time period for stocks simultaneously experiencing extremely high prior returns. The existence of extreme volumes cancels out any potential gains from contrarian or momentum investing strategies.
An elementary approach for the phase retrieval problem
Yuan Sun
Mathematics , 2013,
Abstract: If the phase retrieval problem could be solved by a method similar to solving a system of linear equations in the context of the FFT, the time complexity of computer-based phase retrieval algorithms would be reduced. Here I present such a method, recursive but highly non-linear in nature, based on a close look at the Fourier spectrum of the square of the function norm. In a one-dimensional problem it takes $O(N^2)$ steps of calculation to recover the phases of an N-component complex vector. This method works in 1, 2 or higher dimensional finite Fourier analysis without changes in the behavior of the time complexity. For the one-dimensional problem the performance of an algorithm based on this method is shown, and its limitations are discussed, especially when the input is subject to random noise containing significant high-frequency components.
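The abstract's starting point, the Fourier spectrum of the squared norm, has a standard interpretation: by the discrete Wiener-Khinchin relation, the inverse FFT of the power spectrum $|X|^2$ is the circular autocorrelation of the unknown vector. A minimal NumPy sketch of that relation (illustration only, not the paper's recursive recovery method):

```python
import numpy as np

# Illustration only: phase retrieval starts from the power spectrum |X|^2,
# which is measurable without phases. By the discrete Wiener-Khinchin
# relation, its inverse FFT equals the circular autocorrelation of x.
rng = np.random.default_rng(0)
N = 8
x = rng.normal(size=N) + 1j * rng.normal(size=N)  # the unknown signal

power_spectrum = np.abs(np.fft.fft(x)) ** 2
autocorr_from_power = np.fft.ifft(power_spectrum)

# Direct circular autocorrelation: r[k] = sum_n x[n] * conj(x[(n-k) mod N])
autocorr_direct = np.array(
    [np.sum(x * np.conj(np.roll(x, k))) for k in range(N)]
)

print(np.allclose(autocorr_from_power, autocorr_direct))  # True
```

This is the information any phase retrieval algorithm has to work from; the paper's contribution is the recursive scheme that inverts it.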
The Modelling of Temporal Data in the Relational Database Environment

Yuan Sun

Journal of Computer Science and Technology , 1995,
Abstract: This research takes the view that the modelling of temporal data is a fundamental step towards the solution of capturing the semantics of time. The problems inherent in the modelling of time are not unique to database processing. The representation of temporal knowledge and temporal reasoning arises in a wide range of other disciplines. In this paper an account is given of a technique for modelling the semantics of temporal data and its associated normalization method. It discusses the techniques of processing temporal data by employing a Time Sequence (TS) data model. It shows a number of different strategies which are used to classify different data properties of temporal data, and it goes on to develop the model of temporal data and addresses issues of temporal data application design by introducing the concept of temporal data normalisation.
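The abstract does not detail the TS model, but one classic normalisation step in temporal relational work is "coalescing": merging adjacent or overlapping valid-time intervals that carry the same value. A toy sketch (the function and data are invented for illustration):

```python
# Toy sketch, not the paper's TS model: coalesce value-equivalent
# adjacent/overlapping valid-time intervals in a temporal relation.
def coalesce(rows):
    """rows: list of (start, end, value) tuples, end exclusive."""
    rows = sorted(rows)
    out = []
    for start, end, value in rows:
        if out and out[-1][2] == value and start <= out[-1][1]:
            prev = out[-1]
            # Same value and touching/overlapping interval: merge.
            out[-1] = (prev[0], max(prev[1], end), value)
        else:
            out.append((start, end, value))
    return out

history = [(1, 3, "clerk"), (3, 6, "clerk"), (6, 9, "manager")]
print(coalesce(history))  # [(1, 6, 'clerk'), (6, 9, 'manager')]
```

Coalescing removes redundant row splits so that each fact is stored over one maximal interval, which is the flavour of normalisation the abstract alludes to.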
SWAT Model Application to Assess the Impact of Intensive Corn-farming on Runoff, Sediments and Phosphorous loss from an Agricultural Watershed in Wisconsin  [PDF]
Eric G. Mbonimpa, Yongping Yuan, Megan H. Mehaffey, Michael A. Jackson
Journal of Water Resource and Protection (JWARP) , 2012, DOI: 10.4236/jwarp.2012.47049
Abstract: The potential future increase in corn-based biofuel may be expected to have a negative impact on water quality in streams and lakes of the Midwestern US due to increased use of agricultural chemicals. This study used the SWAT model to assess the impact of continuous-corn farming on sediment and phosphorus loading in the Upper Rock River watershed in Wisconsin. It was assumed that farmers in the area where corn was rotated with soybean would progressively skip soybean in favor of continuous corn as corn became more profitable. Simulations using SWAT indicated that conversion of corn-soybean to corn-corn-soybean would cause 11% and 2% increases in sediment yield and TP loss, respectively. The conversion of corn-soybean to continuous corn caused 55% and 35% increases in sediment yield and TP loss, respectively. However, these increases could be mitigated by applying various BMPs and/or conservation practices such as conservation tillage, fertilizer management and vegetative buffer strips. The conversion to continuous corn with conservation tillage reduced sediment yield by 2% and did not change TP loss. The increase in P fertilizer amount was roughly proportional to the increase in TP loss, and 11% more TP was lost when fertilizer was applied four months before planting. Vegetative buffer strips, 15 to 30 m wide, around corn farms reduced sediment yield by 51 to 70% and TP loss by 41 to 63%.
Quality Improvement Algorithm for Tetrahedral Mesh Based on Optimal Delaunay Triangulation  [PDF]
Shuli Sun, Haoran Bao, Minghui Liu, Yuan Yuan
Intelligent Information Management (IIM) , 2013, DOI: 10.4236/iim.2013.56021
Abstract: The concept of optimal Delaunay triangulation (ODT) and the corresponding error-based quality metric are first introduced. Then one kind of mesh smoothing algorithm for tetrahedral meshes based on the concept of ODT is examined. With regard to its problem of possibly producing illegal elements, this paper proposes a modified smoothing scheme with a constrained optimization model for tetrahedral mesh quality improvement. The constrained optimization model is converted to an unconstrained one and then solved efficiently by combining chaos search with the BFGS (Broyden-Fletcher-Goldfarb-Shanno) algorithm. Quality improvement for tetrahedral meshes is finally achieved by alternately applying the presented smoothing scheme and re-triangulation. Some testing examples are given to demonstrate the effectiveness of the proposed approach.

Simulation of Fault Arc Using Conventional Arc Models  [PDF]
Ling Yuan, Lin Sun, Huaren Wu
Energy and Power Engineering (EPE) , 2013, DOI: 10.4236/epe.2013.54B160
Abstract: Conventional arc models are usually used to study the interaction between a switching arc and the circuit. Simulating the fault arc is important for arc-flash calculations, the selection of electrical equipment, and power system protection. This paper investigates several conventional arc models for calculating the fault arc current. Simulation results show that conventional arc models can be used to simulate the fault arc if the model parameters are chosen properly. This paper provides the parameters of five popular arc models and describes the simulation results for the fault arc.

HMM-FRAME: accurate protein domain classification for metagenomic sequences containing frameshift errors
Yuan Zhang, Yanni Sun
BMC Bioinformatics , 2011, DOI: 10.1186/1471-2105-12-198
Abstract: We introduce HMM-FRAME, a protein domain classification tool based on an augmented Viterbi algorithm that can incorporate error models from different sequencing platforms. HMM-FRAME corrects sequencing errors and classifies putative gene fragments into domain families. It achieved high error detection sensitivity and specificity in a data set with annotated errors. We applied HMM-FRAME in Targeted Metagenomics and a published metagenomic data set. The results showed that our tool can correct frameshifts in error-containing sequences, generate much longer alignments with significantly smaller E-values, and classify more sequences into their native families.

HMM-FRAME provides a complementary protein domain classification tool to conventional profile HMM-based methods for data sets containing frameshifts. Its current implementation is best used for small-scale metagenomic data sets. The source code of HMM-FRAME can be downloaded at http://www.cse.msu.edu/~zhangy72/hmmframe/ and at https://sourceforge.net/projects/hmm-frame/.

Culture-independent methods and high-throughput sequencing technologies now enable us to obtain community random genomes (metagenomes) from different habitats such as arctic soils and the mammalian gut. Currently, metagenomic annotation focuses on phylogenetic complexity and protein composition analysis. An important component in protein composition analysis is protein domain classification, which classifies a putative protein sequence into annotated domain families and thus aids functional analysis. Profile HMM-based alignment is the state-of-the-art method for protein domain classification because of its high sensitivity in classifying remote homologs [1]. In conjunction with the Pfam database [2], which contains over 10,000 annotated protein domain families, HMMER [3] can accurately classify query protein sequences into existing domain families. In addition, the latest version of HMMER can achieve comparable speed to BLAST, making
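HMM-FRAME's augmented recursion (with error states) is not reproduced in the abstract; for orientation, here is the textbook Viterbi decoder it builds on, run over a toy two-state HMM whose parameters are entirely invented:

```python
import math

def viterbi(obs, start_p, trans_p, emit_p):
    """Standard log-space Viterbi decoding. HMM-FRAME augments this
    recursion with insertion/deletion error states; this is only the
    textbook base algorithm."""
    n_states = len(start_p)
    V = [[math.log(start_p[s]) + math.log(emit_p[s][obs[0]])
          for s in range(n_states)]]
    back = [[0] * n_states]
    for o in obs[1:]:
        prev = V[-1]
        row, ptr = [], []
        for s in range(n_states):
            best = max(range(n_states),
                       key=lambda r: prev[r] + math.log(trans_p[r][s]))
            row.append(prev[best] + math.log(trans_p[best][s])
                       + math.log(emit_p[s][o]))
            ptr.append(best)
        V.append(row)
        back.append(ptr)
    # Trace back the most probable state path.
    state = max(range(n_states), key=lambda s: V[-1][s])
    path = [state]
    for ptr in reversed(back[1:]):
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Toy two-state HMM with invented parameters.
path = viterbi(obs=[0, 1, 1],
               start_p=[0.6, 0.4],
               trans_p=[[0.7, 0.3], [0.4, 0.6]],
               emit_p=[[0.5, 0.5], [0.1, 0.9]])
print(path)  # [0, 1, 1]
```

In the profile-HMM setting the states are match/insert/delete positions of a Pfam model; HMM-FRAME's extra error states let the same dynamic program absorb frameshift-inducing sequencing errors.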
Computer Simulation System of Stretch Reducing Mill

B. Y. Sun, S. J. Yuan

Acta Metallurgica Sinica (English Edition) , 2007,
Abstract: The principle of the stretch reducing process is analyzed and three models of pass design are established. Simulations are carried out for variables such as stress, strain, the stretches between the stands, the size parameters of the steel tube, and the roll force parameters. According to its product catalogs, the system can automatically divide the pass series, formulate the rolling table, and simulate the basic technological parameters of the stretch reducing process. All modules are integrated in the VB6 development environment. The system can draw simulation curves and pass pictures. Three databases, comprising the material database, the pass design database, and the product database, are built with Microsoft Access and can be directly edited, corrected, and searched.
Energy Evolution for the Sivers Asymmetries in Hard Processes
Peng Sun, Feng Yuan
High Energy Physics - Phenomenology , 2013,
Abstract: We investigate the energy evolution of the azimuthal spin asymmetries in semi-inclusive hadron production in deep inelastic scattering (SIDIS) and Drell-Yan lepton pair production in pp collisions. The scale dependence is evaluated by applying an approximate solution to the Collins-Soper-Sterman (CSS) evolution equation at one-loop order, which is adequate for moderate $Q^2$ variations. This describes well the unpolarized cross sections for the SIDIS and Drell-Yan processes in the $Q^2$ range of 2.4-100 GeV$^2$. A combined analysis of the Sivers asymmetries in SIDIS from the HERMES and COMPASS experiments, and predictions for the Drell-Yan process at RHIC at $\sqrt{S}=200$ GeV, are presented. We further extend the analysis to the Collins asymmetries and find, for the first time, a consistent description of the HERMES/COMPASS and BELLE experiments with the evolution effects. We emphasize an important test of the evolution effects by studying the di-hadron azimuthal asymmetry in $e^+e^-$ annihilation at a moderate energy range, such as at BEPC at $\sqrt{S}=4.6$ GeV.
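The abstract does not spell out the approximate one-loop solution; for orientation, the generic $b$-space CSS-evolved form (standard in the literature, not necessarily the paper's exact expression) reads

$$ W(b,Q) \;=\; W(b,Q_0)\,\exp\!\left\{-\int_{Q_0^2}^{Q^2}\frac{d\mu^2}{\mu^2}\left[A(\alpha_s(\mu))\,\ln\frac{Q^2}{\mu^2}+B(\alpha_s(\mu))\right]\right\}, $$

where $A=\sum_n A^{(n)}(\alpha_s/\pi)^n$ with $A^{(1)}=C_F$ for quark-initiated processes. The Sudakov exponent governs how the asymmetries dilute as $Q^2$ grows from the SIDIS to the Drell-Yan regime.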
The Relationship between Corporate Social Responsibility and Corporate Value Using the Game Theory
Zhaoliang Sun, Huipeng Yuan
International Journal of Business and Management , 2010, DOI: 10.5539/ijbm.v5n9p166
Abstract: The relationship between corporate social responsibility and corporate value has increasingly attracted the attention of theorists. This paper analyzes the mechanism by which corporate social responsibility affects corporate value, introducing a game model to justify the legitimacy of corporate social responsibility and to demonstrate its contribution to corporate value. Finally, corresponding countermeasures are put forward.
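The paper's game model is not specified in the abstract; as a purely hypothetical illustration of the approach, the snippet below finds the pure-strategy Nash equilibria of a symmetric 2x2 CSR game with invented payoffs, in which mutual CSR adoption is the payoff-dominant equilibrium:

```python
from itertools import product

# Invented payoffs (not from the paper): two firms each choose
# strategy 0 = adopt CSR or 1 = ignore CSR. payoff[own][other] is a
# firm's payoff given its own choice and its rival's choice.
payoff = [[4, 1],
          [3, 2]]

def pure_nash(payoff):
    """Brute-force pure-strategy Nash equilibria of the symmetric game."""
    eq = []
    for i, j in product(range(2), range(2)):
        p1_ok = all(payoff[i][j] >= payoff[k][j] for k in range(2))
        p2_ok = all(payoff[j][i] >= payoff[k][i] for k in range(2))
        if p1_ok and p2_ok:
            eq.append((i, j))
    return eq

print(pure_nash(payoff))  # [(0, 0), (1, 1)]
```

Both mutual CSR adoption (0, 0) and mutual neglect (1, 1) are equilibria here, with (0, 0) yielding the higher payoff, which mirrors the paper's argument that CSR can be individually rational rather than purely altruistic.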
Copyright © 2008-2017 Open Access Library. All rights reserved.