oalib

Search Results: 1 - 10 of 14281 matches for "Data Partion"
All listed articles are free for downloading (OA Articles)
Stereo Video Transmission Using LDPC Code  [PDF]
Rui GUO, Lixin WANG, Xiaoxia JIANG
Int'l J. of Communications, Network and System Sciences (IJCNS) , 2008, DOI: 10.4236/ijcns.2008.13031
Abstract: Stereo video is widely used because it provides depth information. However, stereo video is difficult to store and transmit because of its huge data volume, so a highly efficient channel-coding algorithm and a suitable transmission strategy are needed for video transmission over a bandwidth-limited channel. In this paper, unequal error protection (UEP) based on low-density parity-check (LDPC) codes was used to transmit stereo video over a bandwidth-limited wireless channel. LDPC codes with different correction levels were applied according to the importance of each part of the video stream for reconstruction at the receiver. Simulation results show that the proposed transmission scheme increases the PSNR of the reconstructed image and improves the subjective quality.
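The core idea of UEP can be sketched in a few lines: more important layers of the stream get a stronger (lower-rate) code. The layer names, importance weights and code rates below are invented for illustration; the paper's actual LDPC construction is not shown.

```python
# Hypothetical sketch of unequal error protection (UEP): more important
# layers of a stereo video stream are assigned a stronger (lower-rate)
# LDPC code, so they survive more channel errors.
def assign_code_rates(layers, rates):
    """Map each layer to an LDPC code rate by importance.

    layers: list of (name, importance) with importance in [0, 1];
    rates:  available code rates, strongest (lowest rate) first.
    """
    ordered = sorted(layers, key=lambda layer: layer[1], reverse=True)
    # Most important layer -> lowest code rate (most redundancy).
    return {name: rates[min(i, len(rates) - 1)]
            for i, (name, _) in enumerate(ordered)}

allocation = assign_code_rates(
    [("left-view base", 0.9), ("right-view base", 0.7), ("enhancement", 0.3)],
    [1/2, 2/3, 3/4])
print(allocation)
```

An equal-error-protection baseline would simply give every layer the same rate; the comparison in the paper is between that and the importance-ordered allocation above.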
Software Design of Engine Characteristic Simulation
Fachao Jiang,Molin Wang,Lin Li
Journal of Software , 2012, DOI: 10.4304/jsw.7.2.316-321
Abstract: An engine mathematical model can be used to simulate vehicle dynamic performance and fuel economy during vehicle design. When test data for the engine's external and universal characteristics are available, the least-squares method and multiple regression analysis were used to fit the external-characteristic and universal-characteristic curves; engine torque and fuel consumption under different operating conditions can then be calculated from the fitted equations. When such test data are unavailable and only the engine characteristic curves are known, a grid quantization method was used to simulate the characteristics.
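The least-squares fitting step can be sketched as follows. The speed/torque sample points are made up for demonstration, and a simple quadratic is used in place of whatever polynomial order the authors chose.

```python
# Illustrative least-squares fit of an engine's external torque
# characteristic. The sample points below are invented for demonstration.
import numpy as np

speed = np.array([1000, 2000, 3000, 4000, 5000], dtype=float)  # rpm
torque = np.array([120, 150, 165, 160, 140], dtype=float)      # N*m

# Fit T(n) = a*n^2 + b*n + c by least squares.
coeffs = np.polyfit(speed, torque, deg=2)

def torque_at(n):
    """Torque from the fitted external characteristic at speed n (rpm)."""
    return float(np.polyval(coeffs, n))

print(round(torque_at(2500), 1))
```

The same pattern extends to the universal (fuel-map) characteristic by fitting consumption as a function of both speed and torque, e.g. with multiple regression over polynomial terms in the two variables.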
Vague Partitions
LIANG Jia-rong, LIU Li, WU Hua-jian
Computer Science (计算机科学) , 2009,
Abstract: Because a vague set is characterized by a truth-membership function and a false-membership function, the concepts of degree of truth-compatibility, degree of false-compatibility, degree of truth-equality and degree of false-equality, based on a t-norm and a t-conorm, were introduced. Furthermore, we presented the concepts of semi-vague partitions and vague partitions using these degrees, and we investigated the properties of semi-vague partitions and vague partitions.
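As a rough illustration (not the authors' exact definitions), a vague value can be represented by a pair (t, f) of truth- and false-membership degrees, and compatibility degrees built from a t-norm and t-conorm might look like this:

```python
# Minimal sketch, assuming vague values are pairs (t, f) with
# t + f <= 1, the min t-norm and the max t-conorm. The specific
# compatibility formulas here are illustrative, not the paper's.

def t_norm(a, b):      # min t-norm
    return min(a, b)

def t_conorm(a, b):    # max t-conorm
    return max(a, b)

def truth_compatibility(v1, v2):
    """Degree to which two vague values are jointly true."""
    (t1, _f1), (t2, _f2) = v1, v2
    return t_norm(t1, t2)

def false_compatibility(v1, v2):
    """Degree to which at least one of two vague values is false."""
    (_t1, f1), (_t2, f2) = v1, v2
    return t_conorm(f1, f2)

print(truth_compatibility((0.7, 0.2), (0.5, 0.3)))  # min(0.7, 0.5)
print(false_compatibility((0.7, 0.2), (0.5, 0.3)))  # max(0.2, 0.3)
```

Other t-norm/t-conorm pairs (product/probabilistic sum, Łukasiewicz) would yield different compatibility degrees under the same scheme.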
Research of Big Data Based on the Views of Technology and Application  [PDF]
Zan Mo, Yanfei Li
American Journal of Industrial and Business Management (AJIBM) , 2015, DOI: 10.4236/ajibm.2015.54021
Abstract: In the era of big data, large amounts of data affect our work, life and study, and even national economic development. Big data provides new ways of thinking and new approaches for analyzing and solving problems, and it has gradually become a hot research topic. After describing the concept and characteristics of big data, this paper reviews the development of technologies for big data analysis and storage, and analyzes the trends and distinct values of big data in commercial, manufacturing, biomedical and other applications. Finally, the authors summarize the existing challenges of big data applications and argue that these challenges should be faced squarely.
Data Modeling and Data Analytics: A Survey from a Big Data Perspective  [PDF]
André Ribeiro, Afonso Silva, Alberto Rodrigues da Silva
Journal of Software Engineering and Applications (JSEA) , 2015, DOI: 10.4236/jsea.2015.812058
Abstract: In recent years we have witnessed tremendous growth in the volume and availability of data. This growth results primarily from the emergence of a multitude of sources (e.g. computers, mobile devices, sensors and social networks) that continuously produce structured, semi-structured or unstructured data. Database Management Systems and Data Warehouses are no longer the only technologies used to store and analyze datasets, largely because the volume and complex structure of today's data degrade their performance and scalability. Big Data is one of the recent challenges, since it implies new requirements for data storage, processing and visualization. Nevertheless, properly analyzing Big Data can yield great advantages, because it allows patterns and correlations to be discovered in datasets, and users can apply this processed information to gain deeper insights and business advantages. Thus, data modeling and data analytics have evolved so that huge amounts of data can be processed without compromising performance and availability, instead by "relaxing" the usual ACID properties. This paper provides a broad view and discussion of the current state of this subject, with a particular focus on data modeling and data analytics, describing and clarifying the main differences among the three main approaches in these respects: operational databases, decision-support databases and Big Data technologies.
Seismic Data Collection with Shakebox and Analysis Using MapReduce  [PDF]
Bin Tang, Jianchao Han, Mohsen Beheshti, Garrett Poppe, Liv Nguekap, Rashid Siddiqui
Journal of Computer and Communications (JCC) , 2015, DOI: 10.4236/jcc.2015.35012
Abstract: In this paper we study a seismic sensing platform based on Shakebox, a low-noise, low-power 24-bit wireless accelerometer sensor. Advances in wireless sensors offer the potential to monitor earthquakes in California at unprecedented spatial and temporal scales. We are exploring the possibility of incorporating Shakebox into the California Seismic Network (CSN), a new earthquake monitoring system based on a dense array of low-cost acceleration seismic sensors. Compared to the Phidget/Sheevaplug sensors currently used in CSN, Shakebox sensors have several advantages. However, a Shakebox sensor collects 4 KB of seismic data per second, yielding around 0.4 GB of data in a single day, so processing such a large amount of seismic data becomes a new challenge. We adopt Hadoop/MapReduce, a popular software framework for processing vast amounts of data in parallel on large clusters of commodity hardware. In this paper we report the testbed-generated seismic data, present the design of the map and reduce functions, illustrate the application of MapReduce to the testbed-generated data, and analyze the results.
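The map/reduce pattern over seismic records can be simulated in plain Python. The record format (sensor id, timestamp, acceleration) and the per-sensor peak-acceleration task are invented for illustration; the actual system runs the equivalent functions on a Hadoop cluster.

```python
# Toy map/reduce pass over seismic records: compute the peak
# absolute acceleration per sensor. Record format is hypothetical.
from collections import defaultdict

records = [
    ("S1", 0, 0.02), ("S2", 0, 0.05),
    ("S1", 1, 0.11), ("S2", 1, -0.04),
    ("S1", 2, 0.07),
]

def map_phase(record):
    sensor, _ts, accel = record
    yield sensor, abs(accel)           # emit (key, value) pairs

def reduce_phase(key, values):
    return key, max(values)            # peak acceleration for one sensor

# Shuffle step: group mapped values by key, as the framework would.
groups = defaultdict(list)
for rec in records:
    for key, val in map_phase(rec):
        groups[key].append(val)

peaks = dict(reduce_phase(k, vs) for k, vs in groups.items())
print(peaks)
```

On Hadoop, the shuffle/grouping step is handled by the framework between the Mapper and Reducer classes; only the two phase functions need to be written.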

Data Mining of Historic Hydrogeological and Socioeconomic Data Bases of the Toluca Valley, Mexico  [PDF]
Oliver López-Corona, Oscar Escolero Fuentes, Eric Morales-Casique, Pablo Padilla Longoria, Tomás González Moran
Journal of Water Resource and Protection (JWARP) , 2016, DOI: 10.4236/jwarp.2016.84044
Abstract: In this paper we used several data mining techniques to analyze the coevolution of hydrogeological and socioeconomic data for the Toluca Valley in Mexico. We found nontrivial relations between two historical databases which make clear that groundwater and the economy may be much more closely linked than previously thought. In particular, we found that hydrogeological data trends change during economic crises and election years in Mexico. This shows that the different macroeconomic policies implemented by successive administrations have a direct impact on the way groundwater is used. We also found that hydrogeological data evolve in the direction of the population's transformation from rural to urban, which could represent a paradigm shift in groundwater management with profound repercussions for policy making.
The New Trend and Application of Customer Relationship Management under Big Data Background  [PDF]
Lan Wang
Modern Economy (ME) , 2016, DOI: 10.4236/me.2016.78087
Abstract: One of the important trends in marketing management is digitization. The concept of digitization has been introduced into many fields and is well known, but because its meaning and scope are not sharply defined, companies understand it differently. Digital CRM means a purpose-built, customer-oriented digital customer experience; it is a business reform that improves value creation. Companies today focus not only on the effects brought by technologies but also on how the digital business model makes profits. This article traces the evolution of marketing management from traditional CRM to analytical CRM to digital CRM. Based on the characteristics of digital CRM, we discuss its new trends and applications.
Research on Personal Privacy Protection of China in the Era of Big Data  [PDF]
Hui Zhao, Haoxin Dong
Open Journal of Social Sciences (JSS) , 2017, DOI: 10.4236/jss.2017.56012
Abstract: The purpose of this essay is to investigate the privacy concerns of Chinese people and to develop relevant protective measures. Respondents are divided into two groups by gender and six groups by age to analyze privacy concerns across genders and age groups. The significance of this study lies in protecting personal data as property: personal information, once processed, has economic value, and once disclosed it cannot be retracted, so it is important to study personal privacy in the era of big data and to initiate and enforce legal and regulatory protection measures. Results show that Chinese people's awareness of privacy concerning public places, Internet records, friends' updates and age is insufficient, and that most people, especially women, lack privacy-protection skills. The relevant laws and regulations need to be improved, privacy-protection skills promoted, and the conception of privacy strengthened.
Precise Forecast and Application of Time Delay Receiving Schedule for a New Generation of Polar Orbit Meteorological Satellite  [PDF]
Zhaohui Cheng, Manyun Lin, Cunqun Fan
Journal of Geographic Information System (JGIS) , 2018, DOI: 10.4236/jgis.2018.101006
Abstract: To finely predict the receiving schedule of time-delay data from the new generation of polar-orbit meteorological satellites and to locate lost data quickly, this paper studies the satellite data-recording and program-control procedures and designs a precise forecasting method for the delay-data receiving timeline. The detection payload of China's polar-orbit meteorological satellites has developed from a single payload to multiple payloads, and the detection data must be downlinked to the ground for processing and application. As the number of satellite payloads and the accuracy of each payload's detection channels increase, the volume of detection data will grow further, which in turn requires higher downlink data rates. Because the space data-transmission frequency band is limited, the satellite data-transmission rate cannot be increased much under the existing system. Building on an understanding of the working principle of Fengyun-3, future upgrades should explore new transmission systems for meteorological satellites, covering data-source compression, channel coding, modulation and polarization multiplexing, while also analyzing ways to avoid inter-satellite interference, in order to resolve the contradiction between growing data volume and the limited terrestrial data-transmission resources of the existing system.


Copyright © 2008-2017 Open Access Library. All rights reserved.