Abstract:
Mapping the Internet generally consists of sampling the network from a limited set of sources by using "traceroute"-like probes. This methodology, akin to the merging of different spanning trees to a set of destinations, has been argued to introduce uncontrolled sampling biases that might produce statistical properties of the sampled graph which sharply differ from those of the original one. Here we explore these biases and provide a statistical analysis of their origin. We derive a mean-field analytical approximation for the probability of edge and vertex detection that exploits the role of the number of sources and targets and allows us to relate the global topological properties of the underlying network to the statistical accuracy of the sampled graph. In particular, we find that the edge and vertex detection probability depends on the betweenness centrality of each element. This allows us to show that shortest-path-routed sampling provides a better characterization of underlying graphs with scale-free topology. We complement the analytical discussion with a thorough numerical investigation of simulated mapping strategies in different network models. We show that sampled graphs provide a fair qualitative characterization of the statistical properties of the original networks over a broad range of strategies and exploration parameters. The numerical study also allows the identification of intervals of the exploration parameters that optimize the fraction of nodes and edges discovered in the sampled graph. This finding hints at steps toward more efficient mapping strategies.
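The betweenness dependence stated in this abstract can be illustrated with a hedged mean-field sketch (an illustration of the general form such an approximation takes, not a reproduction of the paper's derivation): if each of the N_S × N_T source–target pairs traces one shortest path, and an edge (i,j) lies on a fraction b_{ij} of all such paths (its normalized betweenness), then treating probes as independent gives

```latex
% Mean-field sketch; assumptions: independent shortest-path probes,
% b_{ij} = normalized edge betweenness. Not the paper's exact formula.
P_{ij} \;\simeq\; 1 - \bigl(1 - b_{ij}\bigr)^{N_S N_T}
       \;\approx\; 1 - e^{-\varepsilon\, b_{ij}},
\qquad \varepsilon = N_S N_T .
```

Under these assumptions, edges with large betweenness are detected with probability close to one, which is consistent with the abstract's claim that heavy-tailed (scale-free) graphs, whose hubs concentrate betweenness, are sampled comparatively faithfully.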

Abstract:
Internet mapping projects generally consist of sampling the network from a limited set of sources by using traceroute probes. This methodology, akin to the merging of spanning trees from the different sources to a set of destinations, leads necessarily to a partial, incomplete map of the Internet. Accordingly, determination of Internet topology characteristics from such sampled maps is in part a problem of statistical inference. Our contribution begins with the observation that the inference of many of the most basic topological quantities -- including network size and degree characteristics -- from traceroute measurements is in fact a version of the so-called 'species problem' in statistics. This observation has important implications, since species problems are often quite challenging. We focus here on the most fundamental example of a traceroute Internet species: the number of nodes in a network. Specifically, we characterize the difficulty of estimating this quantity through a set of analytical arguments, use statistical subsampling principles to derive two proposed estimators, and illustrate the performance of these estimators on networks with various topological characteristics.
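The abstract does not give the form of the paper's two estimators, but the flavour of a species-problem estimate can be sketched with the classical Chao1 lower bound from capture-recapture statistics (an illustrative stand-in, not the paper's method): nodes seen exactly once or twice across probes carry information about how many nodes were never seen at all.

```python
from collections import Counter

def chao1(observations):
    """Chao1 lower-bound estimate of the total number of 'species'
    (here: network nodes) from sightings with repetition.
    Illustrative only; not the estimator proposed in the paper."""
    counts = Counter(observations)
    s_obs = len(counts)                                  # distinct nodes seen
    f1 = sum(1 for c in counts.values() if c == 1)       # seen exactly once
    f2 = sum(1 for c in counts.values() if c == 2)       # seen exactly twice
    if f2 == 0:
        return s_obs + f1 * (f1 - 1) / 2.0               # bias-corrected variant
    return s_obs + f1 * f1 / (2.0 * f2)

# Node identifiers collected across several hypothetical traceroute runs:
sightings = ["a", "a", "b", "c", "c", "d", "e"]
print(chao1(sightings))  # estimate exceeds the 5 distinct nodes observed
```

The estimate rises above the raw count of distinct nodes precisely when many nodes were seen only once, i.e., when the sampling is likely shallow.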

Abstract:
Almost all the vast literature on graph exploration assumes that the graph is static: its topology does not change during the exploration, except for occasional faults. To date, very little is known about the exploration of dynamic graphs, where the topology is continuously changing. The few existing studies have been limited to the centralized (or post-mortem) case, assuming complete a priori knowledge of the changes and the times of their occurrence, and have only considered fully synchronous systems. In this paper, we begin the study of the decentralized (or live) exploration of dynamic graphs, i.e., when the agents operate in the graph unaware of the location and timing of the changes. We consider dynamic rings under the weak 1-interval-connected restriction and investigate the feasibility of their exploration in both the fully synchronous and semi-synchronous cases. When exploration is possible, we examine its cost, focusing on the minimum number of agents capable of exploring the ring. We establish several results highlighting the impact that anonymity and structural knowledge have on the feasibility and complexity of the problem.

Abstract:
Mapping the Internet generally consists of sampling the network from a limited set of sources by using traceroute-like probes. This methodology, akin to the merging of different spanning trees to a set of destinations, has been argued to introduce uncontrolled sampling biases that might produce statistical properties of the sampled graph which sharply differ from those of the original one. In this paper we explore these biases and provide a statistical analysis of their origin. We derive an analytical approximation for the probability of edge and vertex detection that exploits the role of the number of sources and targets and allows us to relate the global topological properties of the underlying network to the statistical accuracy of the sampled graph. In particular, we find that the edge and vertex detection probability depends on the betweenness centrality of each element. This allows us to show that shortest-path-routed sampling provides a better characterization of underlying graphs with broad distributions of connectivity. We complement the analytical discussion with a thorough numerical investigation of simulated mapping strategies in network models with different topologies. We show that sampled graphs provide a fair qualitative characterization of the statistical properties of the original networks over a broad range of strategies and exploration parameters. Moreover, we characterize the level of redundancy and completeness of the exploration process as a function of the topological properties of the network. Finally, we study numerically how the fraction of vertices and edges discovered in the sampled graph depends on the particular deployment of probing sources. The results hint at steps toward more efficient mapping strategies.

Abstract:
The traceroute utility on any computer connected to the Internet can be used to record the round-trip time for small Internet packets between major Internet traffic hubs. Some of the routes include transmission over transoceanic fiber optic cable. We report on traceroute's use by students to quickly and easily estimate the size of the Earth. This is an inexpensive way to involve introductory physics students in a hands-on use of scientific notation and to teach them about systematic errors in data.
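The classroom calculation can be sketched in a few lines; the round-trip time and the fiber-speed factor below are illustrative assumptions, not measured data from the study.

```python
# Sketch of the classroom estimate: a transoceanic round-trip time
# bounds the one-way fiber path length. Values here are illustrative.

C_VACUUM = 3.0e8          # speed of light in vacuum, m/s
FIBER_FACTOR = 2.0 / 3.0  # light in optical fiber travels at roughly 2/3 c

def one_way_distance(rtt_seconds):
    """One-way path length implied by a round-trip time over fiber."""
    return (rtt_seconds / 2.0) * C_VACUUM * FIBER_FACTOR

# Hypothetical traceroute RTT between two continents: 110 ms
rtt = 0.110
print(round(one_way_distance(rtt) / 1000), "km")
```

Comparing a few such path lengths against a map gives the order of magnitude of the Earth's circumference, while the unknown cable routing and router queueing delays supply the lesson on systematic errors.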

Abstract:
The concept of intergenerational equity for intertemporal paths of consumption and capital accumulation is introduced, and an analysis is developed of the dynamic processes of capital accumulation and changes in environmental quality that are intergenerationally equitable. The analysis is based upon the dynamic duality principles originally developed by Koopmans and Uzawa, later extended to the case involving environmental quality.

Abstract:
We present a general form of attribute exploration, a knowledge completion algorithm from Formal Concept Analysis. The aim of our presentation is not only to extend the applicability of attribute exploration through a general description. It may also allow one to view different existing variants of attribute exploration as instances of this general form, which may simplify theoretical considerations.

Abstract:
Traceroute is widely used: from the diagnosis of network problems to the assemblage of internet maps. Unfortunately, there are a number of problems with traceroute methodology, which lead to the inference of erroneous routes. This paper studies particular structures arising in nearly all traceroute measurements. We characterize them as "loops", "cycles", and "diamonds". We identify load balancing as a possible cause for the appearance of false loops, cycles and diamonds, i.e., artifacts that do not represent the internet topology. We provide a new publicly-available traceroute, called Paris traceroute, which, by controlling the packet header contents, provides a truer picture of the actual routes that packets follow. We performed measurements, from the perspective of a single source tracing towards multiple destinations, and Paris traceroute allowed us to show that many of the particular structures we observe are indeed traceroute measurement artifacts.
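The load-balancing artifact described here can be illustrated with a toy model (hypothetical flow identifiers, not real traceroute code): a per-flow load balancer hashes the packet's flow identifier to pick a next hop, classic traceroute changes that identifier on every probe, and Paris traceroute holds it fixed.

```python
import zlib

def next_hop(flow_id: str, candidates):
    """Toy per-flow load balancer: a deterministic hash of the flow
    identifier selects the next hop (model, not a real router)."""
    return candidates[zlib.crc32(flow_id.encode()) % len(candidates)]

routers = ["r1", "r2"]

# Classic traceroute varies the destination port per probe, so the
# flow identifier (and possibly the branch taken) differs each time:
classic_branches = {next_hop(f"10.0.0.1:{33434 + i}", routers) for i in range(8)}

# Paris traceroute keeps the header fields that feed the hash fixed,
# so every probe follows the same branch:
paris_branches = {next_hop("10.0.0.1:33434", routers) for _ in range(8)}
```

When classic probes straddle both branches, the merged hops form a false "diamond" in the inferred map; the fixed-flow probes cannot.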

Abstract:
Virtual Learning Environments are increasingly becoming part of the medical curriculum. In a previous study (Luursema et al., 2006) we found that a combination of computer-implemented stereopsis (visual depth through seeing with both eyes) and dynamic exploration (being able to continuously change one's viewpoint relative to the studied objects in real time) is beneficial to anatomical learning, especially for subjects of low visuo-spatial ability (the ability to form, retrieve, and manipulate mental representations of a visuo-spatial nature). A follow-up study (Luursema et al., 2008) found the contribution of computer-implemented stereopsis to this effect to be small but significant. The present experiment investigated the contribution of dynamic exploration to anatomical learning by means of a virtual learning environment. Seventy participants were tested for visuo-spatial ability and were grouped in pairs matched for this ability. One individual of the pair actively manipulated a 3D reconstruction of the human abdomen; the other individual passively watched the interactions of the first individual on a separate screen. Learning was assessed by two anatomical learning tests. Dynamic exploration provided a small but significant benefit to anatomical learning.

1. Introduction

Increasingly, Virtual Learning Environments (VLEs) are becoming a staple of the medical curriculum. Surgical simulators help young surgeons train laparoscopic skills (e.g., [1]), and electronically enhanced manikins add physiological parameters to procedures traditionally trained on non-augmented manikins. In anatomical learning too, VLEs are increasingly used to complement traditional media such as anatomical atlases, anatomical manikins, and dissection [2]. Acquiring a mental model of human anatomy, including its visuo-spatial aspects, provides the medical student with an essential framework for any further study in the medical field.

In searching to optimize the effectiveness of VLEs for anatomical learning, earlier research assessed the effect of a combination of computer-implemented stereopsis and dynamic exploration, and the effectiveness of computer-implemented stereopsis alone, on this learning [3, 4]. The research reported here continues this series by investigating the contribution of computer-implemented dynamic exploration to anatomical learning. We will mostly ignore the vast literature on surgical training simulators here, as this literature is concerned with training motor skills, with little relevance to our focus on spatial cognition. Anatomical learning is largely

Abstract:
We present an approach to the dynamic valuation of exposure risks in the multi-period setting, which incorporates a dynamic and multiple diversification of risks in a Pareto-optimal sense. This approach extends classical indifference premium principles and can be applied to the valuation of insurance risks. In particular, our method produces explicit computation formulas for the dynamic version of the exponential premium principles. Moreover, we show limit theorems asserting that the risk loading of our valuation decreases to zero as the number of divisions of a risk goes to infinity.