Search Results: 1 - 10 of 100 matches
All listed articles are free for downloading (OA Articles)
Measuring Spreadsheet Formula Understandability  [PDF]
Felienne Hermans, Martin Pinzger, Arie van Deursen
Computer Science, 2012
Abstract: Spreadsheets are widely used in industry because they are flexible and easy to use. Sometimes they are even used for business-critical applications. It is, however, difficult for spreadsheet users to correctly assess the quality of spreadsheets, especially with respect to their understandability. Understandability of spreadsheets is important, since spreadsheets often have a long lifespan, during which they are used by several users. In this paper, we establish a set of spreadsheet understandability metrics. We start by studying related work and interviewing 40 spreadsheet professionals to obtain a set of characteristics that might contribute to understandability problems in spreadsheets. Based on those characteristics, we subsequently determine a number of understandability metrics. To evaluate the usefulness of our metrics, we conducted a series of experiments in which professional spreadsheet users performed a number of small maintenance tasks on a set of spreadsheets from the EUSES spreadsheet corpus. We then calculated the correlation between the metrics and the performance of subjects on these tasks. The results clearly indicate that the number of ranges, the nesting depth, and the presence of conditional operations in formulas significantly increase the difficulty of understanding a spreadsheet.
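As a concrete illustration of one metric named in the abstract, formula nesting depth can be approximated by scanning a formula's parentheses. This is a hypothetical sketch, not the authors' actual metric definition:

```python
def nesting_depth(formula: str) -> int:
    """Maximum parenthesis nesting depth of a spreadsheet formula string.

    A simplified stand-in for the paper's nesting-depth metric: it counts
    parenthesis levels and ignores parentheses inside quoted strings.
    """
    depth = max_depth = 0
    for ch in formula:
        if ch == "(":
            depth += 1
            max_depth = max(max_depth, depth)
        elif ch == ")":
            depth -= 1
    return max_depth

# A deeply nested IF is harder to understand than a flat SUM:
print(nesting_depth("=SUM(A1:A10)"))                       # 1
print(nesting_depth("=IF(A1>0,IF(B1>0,SUM(C1:C3),0),0)"))  # 3
```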
The Detection of Human Spreadsheet Errors by Humans versus Inspection (Auditing) Software  [PDF]
Salvatore Aurigemma, Raymond R. Panko
Computer Science, 2010
Abstract: Previous spreadsheet inspection experiments have had human subjects look for seeded errors in spreadsheets. In this study, subjects attempted to find errors in human-developed spreadsheets, to avoid the potential artifacts created by error seeding. Human success rates were compared to the rates of correct error flagging by spreadsheet static analysis tools (SSATs) applied to the same spreadsheets. The human error detection results were comparable to those of studies using error seeding. However, Excel Error Check and Spreadsheet Professional were almost useless for correctly flagging natural (human) errors in this study.
Data analysis and graphing in an introductory physics laboratory: spreadsheet versus statistics suite  [PDF]
Primoz Peterlin
Physics, 2010, DOI: 10.1088/0143-0807/31/4/021
Abstract: Two methods of data analysis are compared: spreadsheet software and a statistics software suite. Their use is compared by analyzing data collected in three selected experiments from an introductory physics laboratory, involving a linear dependence, a non-linear dependence, and a histogram. The merits of each method are compared.
The Case Analysis of the Scandal of Enron  [cached]
Yuhao Li
International Journal of Business and Management, 2010, DOI: 10.5539/ijbm.v5n10p37
Abstract: The Enron scandal, revealed in October 2001, eventually led to the bankruptcy of the Enron Corporation, an American energy company based in Houston, Texas, and the dissolution of Arthur Andersen, which was one of the five largest audit and accountancy partnerships in the world. In addition to being the largest bankruptcy reorganization in American history at that time, Enron is undoubtedly the biggest audit failure. It was once among the most famous companies in the world, but it is also one of the companies that fell fastest. This paper analyzes the reasons for this event in detail, including management, conflicts of interest, and accounting fraud. It also analyzes the moral responsibility involved from both the individuals' and the corporation's perspectives.
Network Analysis with the Enron Email Corpus  [PDF]
Johanna Hardin, Ghassan Sarkis, P. C. Urc
Computer Science, 2014
Abstract: We use the Enron email corpus to study relationships in a network by applying six different measures of centrality. Our results came out of an in-semester undergraduate research seminar. The Enron corpus is well suited to statistical analyses at all levels of undergraduate education. Through this note's focus on centrality, students can explore the dependence of statistical models on initial assumptions and the interplay between centrality measures and hierarchical ranking, and they can use completed studies as springboards for future research. The Enron corpus also presents opportunities for research into many other areas of analysis, including social networks, clustering, and natural language processing.
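The centrality measures the abstract refers to can be illustrated with the simplest one, degree centrality, on a toy email graph. The edge list below is invented for illustration; the actual Enron corpus contains roughly 150 users and hundreds of thousands of messages:

```python
# Degree centrality on a toy undirected email graph: a node's score is
# its number of contacts divided by (n - 1), where n is the node count.
# Names and edges here are hypothetical, not drawn from the corpus.
edges = [("lay", "skilling"), ("lay", "fastow"),
         ("skilling", "fastow"), ("fastow", "causey")]

nodes = sorted({n for edge in edges for n in edge})
degree = {n: 0 for n in nodes}
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

centrality = {n: degree[n] / (len(nodes) - 1) for n in nodes}
for name, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {score:.2f}")   # fastow ranks highest (1.00)
```

Libraries such as networkx provide this and the other five measures (betweenness, closeness, eigenvector, etc.) out of the box for larger graphs.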
Spreadsheet Refactoring  [PDF]
Patrick O'Beirne
Computer Science, 2010
Abstract: Refactoring is a change made to the internal structure of software to make it easier to understand and cheaper to modify without changing its observable behaviour. A database refactoring is a small change to the database schema which improves its design without changing its semantics. This paper presents example 'spreadsheet refactorings', derived from the above and taking into account the unique characteristics of spreadsheet formulas and VBA code. The techniques are constrained by the tightly coupled data and code in spreadsheets.
The Rise and Collapse of Enron: Financial Innovation, Errors and Lessons  [cached]
Elisa S. Moncarz, Raúl Moncarz, Alejandra Cabello, Benjamin Moncarz
Contaduría y administración, 2006
Abstract: Recent collapses of high-profile businesses like Enron, Worldcom, Parmalat, and Tyco have been a subject of great debate among regulators, investors, government, and academics in the recent past. Enron's case was the greatest failure in the history of American capitalism and had a major impact on financial markets by causing significant losses to investors. Enron was a company ranked by Fortune as the most innovative company in the United States; it exemplified the transition from the production to the knowledge economy. Many lessons can be learned from its collapse. In this paper we present an analysis of the factors that contributed to Enron's rise and failure, underlining the role that energy deregulation and manipulation of financial statements played in Enron's demise. We summarize some lessons that can be learned in order to prevent another Enron and restore confidence in the financial markets, as well as in the accounting and auditing professions.
Spreadsheet Hell  [PDF]
Simon Murphy
Computer Science, 2008
Abstract: This management paper looks at the real-world issues faced by practitioners managing spreadsheets through the production phase of their life cycle. It draws on the commercial experience of several developers working with large corporations as employees, consultants, or contractors. It provides commercial examples of some of the practicalities involved with spreadsheet use around the enterprise.
Spreadsheet Debugging  [PDF]
Yirsaw Ayalew,Roland Mittermeir
Computer Science, 2008
Abstract: Spreadsheet programs, artifacts developed by non-programmers, are used for a variety of important tasks and decisions. Yet a significant proportion of them have severe quality problems. To address this issue, our previous work presented an interval-based testing methodology for spreadsheets. Interval-based testing rests on the observation that spreadsheets are mainly used for numerical computations. It also incorporates ideas from symbolic testing and interval analysis. This paper addresses the issue of efficiently debugging spreadsheets. Based on the interval-based testing methodology, this paper presents a technique for tracing faults in spreadsheet programs. The fault tracing technique proposed uses the dataflow information and cell marks to identify the most influential faulty cell(s) for a given formula cell containing a propagated fault.
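The backward fault tracing the abstract describes can be sketched as a transitive-precedent walk over a cell dependency graph. This is a minimal illustration with a hand-built graph; the paper's actual technique additionally ranks the candidates using dataflow information and interval-based cell marks:

```python
# Hypothetical dependency graph: each formula cell maps to the cells
# its formula reads. A fault observed in D1 may originate in any of
# its transitive precedents.
precedents = {
    "D1": ["B1", "C1"],
    "C1": ["A1", "A2"],
    "B1": ["A1"],
}

def trace_faults(cell, graph):
    """Collect all transitive precedents of `cell` (candidate fault sites)."""
    seen, stack = set(), [cell]
    while stack:
        for p in graph.get(stack.pop(), []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return sorted(seen)

print(trace_faults("D1", precedents))  # ['A1', 'A2', 'B1', 'C1']
```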
A Primer on Spreadsheet Analytics  [PDF]
Thomas A. Grossman
Computer Science, 2008
Abstract: This paper provides guidance to an analyst who wants to extract insight from a spreadsheet model. It discusses the terminology of spreadsheet analytics, how to prepare a spreadsheet model for analysis, and a hierarchy of analytical techniques. These techniques include sensitivity analysis, tornado charts, and backsolving (or goal seeking). This paper presents native-Excel approaches for automating these techniques, and discusses add-ins that are even more efficient. Spreadsheet optimization and spreadsheet Monte Carlo simulation are briefly discussed. The paper concludes by calling for empirical research and by describing desired features of spreadsheet sensitivity analysis and spreadsheet optimization add-ins.
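Backsolving (goal seeking), one of the techniques listed above, can be sketched outside Excel as a bisection search for the input that drives a model's output to a target, which is what Excel's Goal Seek does numerically. The profit model and its numbers here are invented for illustration:

```python
# Hypothetical model: profit as a function of units sold.
def profit(units):
    price, unit_cost, fixed_cost = 25.0, 15.0, 40_000.0
    return units * (price - unit_cost) - fixed_cost

def goal_seek(f, target, lo, hi, tol=1e-6):
    """Bisection backsolve: find x in [lo, hi] with f(x) == target.

    Assumes f is monotone increasing on [lo, hi] and that the
    target value is bracketed by f(lo) and f(hi).
    """
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Backsolve for the break-even volume (profit == 0):
break_even = goal_seek(profit, 0.0, 0.0, 100_000.0)
print(round(break_even))  # 4000 units
```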
Copyright © 2008-2017 Open Access Library. All rights reserved.