Search Results: 1 - 10 of 609 matches for "Bob Zeidman"
All listed articles are free for downloading (OA Articles)
Source Code Comparison of DOS and CP/M  [PDF]
Bob Zeidman
Journal of Computer and Communications (JCC) , 2016, DOI: 10.4236/jcc.2016.412001
Abstract: In a previous paper [1], I compared DOS from Microsoft and CP/M from Digital Research Inc. (DRI) to determine whether the original DOS source code had been copied from CP/M source code, as had been rumored for many years [2] [3]. At the time, the source code for CP/M was publicly available but the source code for DOS was not, so my comparison was limited to the DOS 1.11 binary code and the CP/M 2.0 source code from 1981. Since that time, the Computer History Museum in Mountain View, California, received the source code for DOS 2.0 from Microsoft and was given permission to make it public. The museum also received the source code for DOS 1.1 from Tim Paterson, the developer originally contracted by Microsoft to write DOS. In this paper, I perform a further analysis using the newly accessible source code and determine that no code was copied. I further conclude that the commands were not copied, but that a substantial number of the system calls were copied.
A Code Correlation Comparison of the DOS and CP/M Operating Systems  [PDF]
Robert Zeidman
Journal of Software Engineering and Applications (JSEA) , 2014, DOI: 10.4236/jsea.2014.76048
Abstract: For years, rumors have circulated that the code for the original DOS operating system, created by Microsoft for the IBM personal computer, was actually copied from the CP/M operating system developed by Digital Research Incorporated. In this paper, scientifically tested and accepted mathematical forensic analysis techniques, step-by-step processes, and advanced software code comparison tools are used to compare early versions of the two code bases. The conclusion is reached that no copying of code took place.
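The abstract does not spell out the comparison tools it relies on; as a rough sketch of one simple code-correlation measure (a Jaccard-style overlap of statements and identifiers, assumed here purely for illustration rather than taken from the paper), the Python fragment below scores two source files by how many visible elements they share.

import re

def elements(source):
    """Split source text into rough statements and identifiers."""
    statements = {line.strip() for line in source.splitlines() if line.strip()}
    identifiers = set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*", source))
    return statements, identifiers

def jaccard(x, y):
    """Size of the intersection over the size of the union."""
    return len(x & y) / len(x | y) if (x | y) else 0.0

def correlation(source_a, source_b):
    """Average the statement overlap and the identifier overlap."""
    stmts_a, ids_a = elements(source_a)
    stmts_b, ids_b = elements(source_b)
    return 0.5 * (jaccard(stmts_a, stmts_b) + jaccard(ids_a, ids_b))

A score near 1 indicates files that share most of their visible elements; a score near 0 indicates little overlap. A forensic comparison would combine several such measures and then examine the matches by hand.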
Measuring Whitespace Pattern Sequences as an Indication of Plagiarism  [PDF]
Nikolaus Baer, Robert Zeidman
Journal of Software Engineering and Applications (JSEA) , 2012, DOI: 10.4236/jsea.2012.54029
Abstract: There are several methods and technologies for comparing the statements, comments, strings, identifiers, and other visible elements of source code in order to efficiently identify similarity. In a prior paper we found that comparing the whitespace patterns was not precise enough to identify copying by itself. However, several possible methods for improving the precision of a whitespace pattern comparison were presented, the most promising of which was an examination of the sequences of lines with matching whitespace patterns. This paper demonstrates a method of evaluating the sequences of matching whitespace patterns and a detailed study of the method’s reliability.
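As a minimal sketch of the idea described above (assuming a crude definition of a line's whitespace pattern; the authors' actual tooling and thresholds are not given in the abstract), one could reduce each line to its whitespace pattern and then look for long runs of consecutive matching patterns in two files.

import re
from difflib import SequenceMatcher

def whitespace_pattern(line):
    """Keep runs of spaces and tabs; collapse everything else to a single 'X'."""
    return re.sub(r"[^ \t]+", "X", line.rstrip("\n"))

def matching_pattern_sequences(text_a, text_b, min_lines=3):
    """Return runs of consecutive lines whose whitespace patterns match in both files."""
    patterns_a = [whitespace_pattern(l) for l in text_a.splitlines()]
    patterns_b = [whitespace_pattern(l) for l in text_b.splitlines()]
    matcher = SequenceMatcher(None, patterns_a, patterns_b, autojunk=False)
    return [m for m in matcher.get_matching_blocks() if m.size >= min_lines]

Long matching runs are the signal of interest: isolated matching lines occur often by chance, but extended sequences of identical whitespace patterns are much less likely in independently written code.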
Iterative Filtering of Retrieved Information to Increase Relevance
Robert Zeidman
Journal of Systemics, Cybernetics and Informatics , 2007,
Abstract: Efforts have been underway for years to find more effective ways to retrieve information from large knowledge domains. This effort is now being driven particularly by the Internet and the vast amount of information that is available to unsophisticated users. In the early days of the Internet, some of this effort involved allowing users to enter Boolean equations of search terms into search engines, for example, rather than just a list of keywords. More recently, effort has focused on understanding a user's desires from past search histories in order to narrow searches. There has also been much effort to improve the ranking of results based on some measure of relevance. This paper discusses using iterative filtering of retrieved information to home in on useful information. This work was done for finding source code correlation, and the author extends his findings to Internet searching and e-commerce. The paper presents specific information about a particular filtering application and then generalizes it to other forms of information retrieval.
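A minimal sketch of iterative filtering as described above, assuming a simple list of results and user-chosen predicate filters (the paper's own application is source code correlation, not this toy example):

def iterative_filter(results, filters):
    """Apply filters one round at a time; after each round the user can
    inspect the smaller result set and choose or refine the next filter."""
    for keep in filters:
        results = [item for item in results if keep(item)]
        if not results:
            break
    return results

# Hypothetical usage: narrow search hits by keyword, then by year.
hits = [
    {"title": "Source code correlation tools", "year": 2014},
    {"title": "Whitespace pattern matching", "year": 2012},
    {"title": "Unrelated result", "year": 2007},
]
print(iterative_filter(hits, [
    lambda h: "code" in h["title"].lower(),
    lambda h: h["year"] >= 2012,
]))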
About the Causes of the Koror Bridge Collapse  [PDF]
Corneliu Bob
Open Journal of Safety Science and Technology (OJSST) , 2014, DOI: 10.4236/ojsst.2014.42013
Abstract: This paper has been prepared from basic works published mainly after 2008, when the collapse investigation was made available. The main contributions of the paper are: a proper model for the deflections at mid-span of the bridge, the state of stress in the elastic and post-elastic stages for the same phases of behavior, the stage of cracking at the top of the cantilever beams, the effect of the repair on the structure of the Koror Bridge, and a probabilistic evaluation. The present study is based on well-known and simple engineering tools: a one-dimensional beam-type model was analyzed.
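For orientation only, the one-dimensional beam-type tools mentioned above are of the kind used in elementary structural analysis; for example, the elastic tip deflection of a cantilever of span L carrying a uniformly distributed load w is δ = wL^4 / (8EI), where E is the modulus of elasticity and I the second moment of area of the section. The deflection model in the paper itself goes beyond this elastic value to cover the cracked and post-elastic stages listed above.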
Climate Shifts and the Role of Nano Structured Particles in the Atmosphere  [PDF]
Bob Ursem
Atmospheric and Climate Sciences (ACS) , 2016, DOI: 10.4236/acs.2016.61005
Abstract: A global net-sum equilibrium in heat exchange is a fact, and thus global climate change does not exist, but climate shifts within climate cells, especially in the northern temperate cell, do. The global climate has always been homeostatic and has recuperated from far larger climate impacts in the past. Current climate models need a drastic revision of their focus on carbon dioxide as the main driver. Carbon dioxide and other carbon gases do influence albedo patterns, but globally they provide a homeostatic effect with a commonly accepted warming impact of 0.3 degrees Celsius. Carbon dioxide does not trigger the climate shifts; it is an indicator of exhaust from combustion processes that emit very small particles, and it is these particles that drive the shifts. The fine dust and nano structured particles cause the shifts of the climate in cells, as demonstrated in this article, and result in more thunder and lightning, extreme weather, and distinct drought and precipitation patterns. The causes underlying these shifts are nano structured particles in the upper troposphere and lower stratosphere, which are largely produced and remain in the northern temperate climate cell and are dispersed by jet streams and low- and high-pressure areas. However, because of electrical charge, caused by friction or by the anthropogenic negatively charged nano structured particles themselves, these emissions travel up to the lower stratosphere, become neutralized at the electrosphere level, and also tend to move toward the Arctic. The southern hemisphere climate faces limited anthropogenic emissions, because only 10 percent of the world population contributes there, with less polluting activities, and it has not changed; but that could well be because it is equally influenced and driven, like the northern hemisphere, by the variation of solar activity in diverse cycles. The present problem is that we produce huge amounts of airborne nano structured particles from combustion processes that never existed before. The only nano particles known in nature are those produced in limited amounts by volcanic eruptions and natural forest fires. The natural feedback systems that moderate climate shifts and influence global climate are convection by cumulonimbus clouds, sea currents and vegetation adaptation. A novel ultra-fine dust electric reduction device (UFDRS-System), created by the author, diminishes these particles down to a size of less than 10 nanometres in diameter and thus prevents major electrical drift of nano structured particulates in the upper troposphere and lower stratosphere.
The Scientific Evidence That “Intent” Is Vital for Healthcare  [PDF]
Bob Johnson
Open Journal of Philosophy (OJPP) , 2017, DOI: 10.4236/ojpp.2017.74022
Abstract: THINKING cannot occur without electrons, a point philosophically, scientifically and irrefutably confirmed for all by the electroencephalogram (EEG). However, for 100 years, electrons and their ilk have scrupulously obeyed the Uncertainty Principle. Probability rules. The way human beings reason is by concluding that if event B is seen to follow cause A, it will do so again tomorrow; electrons do not even support this today. Hume's critique of causality, which Kant failed to refute, gains traction from quantum mechanics. Despite the need to insert the word "probably" into every piece of human reasoning, healthcare demonstrates an element of unexpected stability. The label "intent" is expanded to cover this anomaly, endeavouring to highlight how living cells cope with the impact of this unknowability, this Uncertainty. Mental health follows suit, though here the uncertainty comes additionally from "blockage" of the frontal lobes consequent upon trauma or terror. The collapse of today's psychiatry is pathognomonic, and medically solipsistic. The roles of "intent" and its close relative, consent, are offered as remedies, not only for mental disease, relabelled here "social defeat", but also for the global disease of violence, culminating in the biggest health threat of them all, thermonuclear war.
The Sociology of Knowledge, Citizenship and the Purification of Politics  [PDF]
Jed Donoghue, Bob White
Sociology Mind (SM) , 2013, DOI: 10.4236/sm.2013.31003
Abstract: We reinterpret citizenship using Mannheim's classical sociology of knowledge and through a more recent variant on it in Latour's argument that "we have never been modern" (Latour, 1991). On that basis, we understand citizenship as a recursive effect of disputes over belonging and membership (Isin, 2002), where those disputes entail the three forms of political rationality or "thought styles" which Mannheim and Latour variously suggested: the linearly individual rationality of liberalism, dialectically collective socialism, and culturally collective conservatism. Marshall defines citizenship as a "status bestowed on those who are full members of a community" (Marshall, 1973) and presents an image of evolutionary progress in Britain, from civil to political rights and finally to the social form. We argue that Marshall was entangled in evolutionary and teleological images of citizenship. We suggest that sociologies of knowledge allow a re-reading of "citizenship" that can accommodate these conceptual difficulties. Mannheim called into question the "progress" implied or stated in theories of "stages", stressing instead the continuing interaction between different ways of knowing social reality, or between what he called "thought styles". We apply Mannheim to "citizenship" in order to lift two "purifications", so that humanity is both natural and political.
The Impact of Computer Mediated Communication (CMC) on Productivity and Efficiency in Organizations: A Case Study of an Electrical Company in Trinidad and Tobago  [PDF]
Kenrick Bob, Prahalad Sooknanan
Advances in Journalism and Communication (AJC) , 2014, DOI: 10.4236/ajc.2014.22005
Abstract: This study investigates how computer mediated communication (CMC), and the electronic mailing system in particular, has impacted productivity and efficiency, as well as interpersonal interaction and the use of technology, in the organization. An electricity company was chosen for the case study since it had achieved approximately 78% computerization and networking of its office staff. After the judgemental sampling technique was used to identify the organization, random sampling was used to select a sample of 100 respondents. A questionnaire survey with sixteen items was self-administered over a one-week period. The results showed that 73% of the respondents agreed that CMC enhanced their overall productivity and efficiency, while 27% disagreed. However, while the findings revealed that the introduction of CMC increased the use of technology as a whole, it impacted negatively on interpersonal relationships among respondents.
Forward and Backward Inference in Spatial Cognition
Will D. Penny, Peter Zeidman, Neil Burgess
PLOS Computational Biology , 2013, DOI: 10.1371/journal.pcbi.1003383
Abstract: This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of ‘lower-level’ computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.
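A toy sketch of forward and backward inference, using a discrete hidden Markov model over two locations with a noisy sensor (a deliberate simplification; the paper's probabilistic model and its mapping to neural data are much richer), computes the smoothed location posteriors as follows.

import numpy as np

def forward_backward(transition, emission, observations, prior):
    """Smoothed state posteriors P(state_t | all observations) in a discrete HMM.
    transition[i, j] = P(next state j | state i); emission[i, o] = P(obs o | state i)."""
    T, S = len(observations), len(prior)
    alpha = np.zeros((T, S))   # forward messages (filtering)
    beta = np.ones((T, S))     # backward messages (smoothing)
    alpha[0] = prior * emission[:, observations[0]]
    alpha[0] /= alpha[0].sum()
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ transition) * emission[:, observations[t]]
        alpha[t] /= alpha[t].sum()
    for t in range(T - 2, -1, -1):
        beta[t] = transition @ (emission[:, observations[t + 1]] * beta[t + 1])
        beta[t] /= beta[t].sum()
    posterior = alpha * beta
    return posterior / posterior.sum(axis=1, keepdims=True)

# Two locations; the sensor reports the true location most of the time.
transition = np.array([[0.8, 0.2], [0.3, 0.7]])
emission = np.array([[0.9, 0.1], [0.2, 0.8]])
print(forward_backward(transition, emission, [0, 0, 1], np.array([0.5, 0.5])))

The forward pass alone gives the filtered estimate of the current location; adding the backward pass refines earlier estimates once later observations are known, which is the sense in which backward inference supports route refinement.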