Abstract:
Social networks are discrete systems with a large amount of heterogeneity among nodes (individuals). Measures of centrality aim to quantify the importance of nodes for structure and function. Here we ask to what extent the most central nodes can be found by purely local search. We find that many networks have close-to-optimal searchability under eigenvector centrality, outperforming searches based on degree and betweenness centrality. Searchability of the strongest spreaders in epidemic dynamics tends to be substantially larger for supercritical than for subcritical spreading.
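As a rough illustration of this kind of question, the sketch below (graph size, density, and search rule are all hypothetical choices, not taken from the paper) builds a random graph, computes eigenvector centrality by power iteration, and checks how often a greedy local climb on node degree ends at the most eigenvector-central node:

```python
import numpy as np

rng = np.random.default_rng(0)

# Erdos-Renyi random graph (size and density are arbitrary choices)
n, p = 200, 0.05
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Eigenvector centrality via power iteration
x = np.ones(n)
for _ in range(1000):
    x = A @ x
    x /= np.linalg.norm(x)
top_eig = int(np.argmax(x))

def greedy_degree_search(start):
    """Purely local search: hop to the highest-degree neighbor until stuck."""
    deg = A.sum(axis=1)
    node = start
    while True:
        nbrs = np.flatnonzero(A[node])
        if len(nbrs) == 0:
            return node
        best = int(nbrs[np.argmax(deg[nbrs])])
        if deg[best] <= deg[node]:
            return node
        node = best

hits = sum(greedy_degree_search(s) == top_eig for s in range(n))
print(f"local search reaches the top eigenvector-central node from {hits}/{n} starts")
```

Degree is observable locally, while eigenvector centrality is a global quantity; the fraction of starting points from which the local climb finds the globally most central node is one simple operationalization of searchability.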

Abstract:
Boolean networks at the critical point have been a matter of debate for many years, e.g., regarding the scaling of the number of attractors with system size. Recently it was found that this number scales superpolynomially with system size, contrary to a common earlier expectation of sublinear scaling. Here we point out that these results were obtained using deterministic parallel update, under which a large fraction of the attractors are in fact artifacts of the updating scheme. This limits the significance of these results for biological systems, where noise is omnipresent. We take a fresh look at attractors in Boolean networks with the original motivation of simplified models for biological systems in mind. We test the stability of attractors with respect to infinitesimal deviations from synchronous update and find that most attractors found under parallel update are artifacts arising from the synchronous clocking mode. The remaining attractors are stable against fluctuating response delays. For this subset of stable attractors we observe sublinear scaling of their number with system size.
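A minimal version of such a stability test, on a toy random Boolean network rather than the ensembles studied in the paper, might look as follows: enumerate all synchronous attractors exhaustively, then keep only those whose cycle states are closed under every single-node (asynchronous) update. The closure criterion used here is a crude stand-in for the paper's stability analysis:

```python
import itertools, random
random.seed(1)

# Random Boolean network: K = 2 inputs and a random truth table per node
n, K = 8, 2
inputs = [random.sample(range(n), K) for _ in range(n)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(n)]

def node_update(s, i):
    idx = sum(s[j] << b for b, j in enumerate(inputs[i]))
    return tables[i][idx]

def sync_step(s):
    return tuple(node_update(s, i) for i in range(n))

# Enumerate all synchronous attractors over the 2^n states
attractors = set()
for s0 in itertools.product((0, 1), repeat=n):
    seen, s = set(), s0
    while s not in seen:
        seen.add(s)
        s = sync_step(s)
    cycle, t = [], s          # s is the first revisited state, lying on the cycle
    while True:
        cycle.append(t)
        t = sync_step(t)
        if t == s:
            break
    attractors.add(frozenset(cycle))

def stable(cycle):
    """Crude stability proxy: every single-node (asynchronous) update of
    every cycle state must stay inside the cycle."""
    return all(tuple(s[:i] + (node_update(s, i),) + s[i + 1:]) in cycle
               for s in cycle for i in range(n))

n_stable = sum(stable(c) for c in attractors)
print(f"{n_stable} of {len(attractors)} synchronous attractors pass the test")
```

Fixed points always pass this test, whereas cycles that depend on several nodes switching in exact lockstep typically fail, which is the qualitative distinction the abstract draws.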

Abstract:
Many physical and chemical processes, such as the folding of biopolymers, are best described as dynamics on large combinatorial energy landscapes. A concise approximate description of the dynamics is obtained by partitioning the micro-states of the landscape into macro-states. Since most landscapes of interest are not analytically tractable, the probabilities of transitions between macro-states need to be extracted numerically from the microscopic ones, typically by full enumeration of the state space. Here we propose to approximate transition probabilities by a Markov chain Monte Carlo method. For landscapes of the number partitioning problem and an RNA switch molecule we show that the method allows for accurate probability estimates at significantly reduced computational cost.
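The idea can be sketched on a toy number-partitioning landscape (the instance, macro-state definition, and temperature below are all hypothetical): run a Metropolis chain over the micro-states and estimate the macro-state transition matrix from the observed trajectory instead of enumerating the full state space:

```python
import math, random
random.seed(0)

# Toy number-partitioning landscape: spin s_i = +/-1 assigns number a_i
# to one of two subsets; the energy is the subset-sum difference.
a = [3, 5, 7, 11, 13, 17, 19, 23]
n = len(a)

def energy(s):
    return abs(sum(si * ai for si, ai in zip(s, a)))

# Hypothetical two-macro-state partition: near-balanced states vs the rest
def macro(s):
    return 0 if energy(s) <= 4 else 1

def metropolis_step(s, beta=0.5):
    i = random.randrange(n)
    s2 = list(s); s2[i] = -s2[i]; s2 = tuple(s2)
    dE = energy(s2) - energy(s)
    if dE <= 0 or random.random() < math.exp(-beta * dE):
        return s2
    return s

# Estimate macro-state transition probabilities from one long trajectory
counts = [[0, 0], [0, 0]]
s = tuple(random.choice((-1, 1)) for _ in range(n))
for _ in range(200000):
    m0 = macro(s)
    s = metropolis_step(s)
    counts[m0][macro(s)] += 1

T = [[c / max(1, sum(row)) for c in row] for row in counts]
print("estimated macro-transition matrix:", T)
```

For this 8-spin instance the 2^8 micro-states could of course be enumerated directly; the point of the sampling approach is that the same estimator still works when the state space is far too large for enumeration.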

Abstract:
Biological systems rely on robust internal information processing: survival depends on highly reproducible dynamics of regulatory processes. Biological information-processing elements, however, are intrinsically noisy (genetic switches, neurons, etc.). Such noise poses severe stability problems to system behavior, as it tends to desynchronize system dynamics (e.g. via fluctuating response or transmission times of the elements). Synchronicity in parallel information processing is not readily sustained in the absence of a central clock. Here we analyze the influence of topology on synchronicity in networks of autonomous noisy elements. In numerical and analytical studies we find a clear distinction between reliable and unreliable dynamical attractors, depending on the topology of the circuit. In the reliable cases, synchronicity is sustained, while in the unreliable scenario, fluctuating responses of single elements can gradually desynchronize the system, leading to non-reproducible behavior. We find that the fraction of reliable dynamical attractors strongly correlates with the underlying circuitry. Our model suggests that the observed motif structure of biological signaling networks is shaped by the biological requirement for reproducibility of attractors.
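The distinction can be illustrated with two hypothetical two-node circuits: a copy loop, whose synchronous 2-cycle is destroyed by any serial update (a crude stand-in for fluctuating response times), and a mutual-inhibition pair, whose fixed points survive arbitrary update orders:

```python
import random
random.seed(0)

# Copy loop: each node copies the other -> synchronous 2-cycle (0,1) <-> (1,0).
# Mutual inhibition: each node negates the other -> fixed points (0,1), (1,0).
copy_loop  = lambda s, i: s[1 - i]
mutual_inh = lambda s, i: 1 - s[1 - i]

def async_survives(rule, start, trials=100, steps=50):
    """Fraction of random serial-update runs staying in {(0,1), (1,0)}."""
    ok = 0
    for _ in range(trials):
        s, alive = start, True
        for _ in range(steps):
            i = random.randrange(2)                 # noisy update order
            new = list(s); new[i] = rule(s, i); s = tuple(new)
            if s not in {(0, 1), (1, 0)}:
                alive = False
                break
        ok += alive
    return ok / trials

print("copy loop:", async_survives(copy_loop, (0, 1)))          # unreliable
print("mutual inhibition:", async_survives(mutual_inh, (0, 1))) # reliable
```

The copy loop's attractor exists only under exact lockstep (the first out-of-order update collapses it), while the mutual-inhibition fixed point is insensitive to update order, matching the reliable/unreliable dichotomy described above.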

Abstract:
The functioning of a living cell is largely determined by the structure of its regulatory network, comprising non-linear interactions between regulatory genes. An important factor for the stability and evolvability of such regulatory systems is neutrality: typically, a large number of alternative network structures give rise to the necessary dynamics. Here we study the discretized regulatory dynamics of the yeast cell cycle [Li et al., PNAS, 2004] and the set of networks capable of reproducing it, which we call functional. Among these, the empirical yeast wildtype network is close to optimal with respect to sparse wiring. Under point mutations, which establish or delete single interactions, the neutral space of functional networks is fragmented into 4.7 * 10^8 components. One of the smaller ones contains the wildtype network. On average, functional networks reachable from the wildtype by mutations are sparser, have higher noise resilience, and have fewer fixed-point attractors than networks outside of this wildtype component.
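A sketch of this type of analysis, using threshold dynamics in the style of Li et al. but on a small random network (not the actual yeast wiring): call a network functional if it reproduces a reference trajectory, and count which point mutations of the interaction matrix are neutral, i.e. preserve functionality:

```python
import random
random.seed(3)

# Threshold dynamics in the style of Li et al.: a node switches on if its
# summed input is positive, off if negative, and holds its state at zero.
n = 4
W = [[random.choice((-1, 0, 1)) for _ in range(n)] for _ in range(n)]

def step(M, s):
    out = []
    for i in range(n):
        h = sum(M[i][j] * s[j] for j in range(n))
        out.append(1 if h > 0 else 0 if h < 0 else s[i])
    return tuple(out)

# Reference trajectory of the "wildtype" from a fixed initial state
s0 = (1, 0, 0, 0)
ref = [s0]
for _ in range(10):
    ref.append(step(W, ref[-1]))

def functional(M):
    """A network is functional if it reproduces the reference trajectory."""
    s = s0
    for t in range(10):
        s = step(M, s)
        if s != ref[t + 1]:
            return False
    return True

# Point mutations: change one interaction to one of the other two values
neutral, total = 0, 0
for i in range(n):
    for j in range(n):
        for v in (-1, 0, 1):
            if v == W[i][j]:
                continue
            Wm = [row[:] for row in W]
            Wm[i][j] = v
            total += 1
            neutral += functional(Wm)
print(f"{neutral}/{total} point mutations are neutral")
```

Repeating this neighborhood exploration transitively from each functional network is what yields the connected components of the neutral space described above.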

Abstract:
We present a computational method for finding attractors (ergodic sets of states) of Boolean networks under asynchronous update. The approach is based on a systematic removal of state transitions to render the state transition graph acyclic. In this reduced state transition graph, all attractors are fixed points that can be enumerated with little effort in most instances. This attractor set is then extended to the attractor set of the original dynamics. Our numerical tests on standard Kauffman networks indicate that the method is efficient in the sense that the total number of state vectors visited grows moderately with the number of states contained in attractors.
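The attractors targeted by the method, i.e. ergodic sets under asynchronous update, are exactly the terminal strongly connected components of the state transition graph. For tiny networks they can be found by brute-force reachability, which serves as a reference implementation (this is the naive baseline, not the paper's transition-removal algorithm):

```python
import itertools, random
random.seed(4)

# Small random Boolean network with K = 2 inputs per node
n, K = 6, 2
inputs = [random.sample(range(n), K) for _ in range(n)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(n)]

def node_update(s, i):
    idx = sum(s[j] << b for b, j in enumerate(inputs[i]))
    return tables[i][idx]

def async_successors(s):
    """States reachable by updating a single node whose value changes."""
    succ = set()
    for i in range(n):
        v = node_update(s, i)
        if v != s[i]:
            t = list(s); t[i] = v
            succ.add(tuple(t))
    return succ or {s}          # a fixed point loops back to itself

states = list(itertools.product((0, 1), repeat=n))
succ = {s: async_successors(s) for s in states}

def reachable(s):
    seen, stack = {s}, [s]
    while stack:
        for t in succ[stack.pop()]:
            if t not in seen:
                seen.add(t); stack.append(t)
    return seen

reach = {s: reachable(s) for s in states}

# s belongs to an attractor iff every state reachable from s leads back to s
attractors = {frozenset(reach[s]) for s in states
              if all(s in reach[t] for t in reach[s])}
print(len(attractors), "asynchronous attractor(s) found")
```

The baseline visits all 2^n states; the point of the method in the abstract is to reach the same attractor set while visiting a number of states that grows only moderately with the attractor sizes.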

Abstract:
We compare asynchronous vs. synchronous update of discrete dynamical networks and find that a simple time delay in the nodes may induce reproducible deterministic dynamics even in the case of asynchronous update in random order. In particular, we observe that the dynamics under synchronous parallel update can be reproduced accurately under random asynchronous serial update for a large class of networks. This mechanism points to a possible general principle of how computation in gene regulatory networks can be kept in a quasi-deterministic ``clockwork mode'' despite the absence of a central clock. A delay similar to the one occurring in gene regulation causes synchronization in the model. Stability under asynchronous dynamics disfavors topologies containing loops, comparing well with the observed strong suppression of loops in biological regulatory networks.
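One minimal way a response delay can restore determinism, sketched below under the simplifying assumption (ours, not necessarily the paper's model) that a node's new output becomes visible to others only after a one-round delay: serial updates within a round then all read the pre-round snapshot, so the random order drops out and the synchronous trajectory is reproduced exactly:

```python
import random
random.seed(5)

# Random Boolean network with two inputs per node
n = 6
inputs = [random.sample(range(n), 2) for _ in range(n)]
tables = [[random.randint(0, 1) for _ in range(4)] for _ in range(n)]

def f(s, i):
    return tables[i][s[inputs[i][0]] + 2 * s[inputs[i][1]]]

def sync_step(s):
    return tuple(f(s, i) for i in range(n))

def delayed_async_step(s):
    """Serial update in random order, but each node's new value becomes
    visible only after a one-round delay: every node reads the pre-round
    snapshot, so the update order no longer matters."""
    snapshot = s
    order = list(range(n))
    random.shuffle(order)
    new = list(s)
    for i in order:
        new[i] = f(snapshot, i)      # inputs taken from the delayed snapshot
    return tuple(new)

s = tuple(random.randint(0, 1) for _ in range(n))
traj_sync, traj_delay = [s], [s]
for _ in range(20):
    traj_sync.append(sync_step(traj_sync[-1]))
    traj_delay.append(delayed_async_step(traj_delay[-1]))
print("synchronous trajectory reproduced:", traj_sync == traj_delay)
```

Without the snapshot (i.e. if each serial update immediately fed into the next), the two trajectories would generally diverge; the delay is what buffers the nodes against the random update order.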

Abstract:
Regulatory dynamics in biology is often described by continuous rate equations for continuously varying chemical concentrations. Binary discretization of state space and time leads to Boolean dynamics. In the latter, the dynamics has been called unstable if flip perturbations lead to damage spreading. Here we find that this stability classification strongly differs from the stability properties of the original continuous dynamics under small perturbations of the state vector. In particular, random networks of nodes with large sensitivity yield stable dynamics under small perturbations.
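The Boolean notion of instability referred to here can be sketched as a damage-spreading experiment on a hypothetical random Boolean network with in-degree K = 3 (all sizes and horizons below are arbitrary choices): flip one node and track the Hamming distance between the perturbed and unperturbed trajectories:

```python
import random
random.seed(6)

# Random Boolean network with in-degree K = 3; with random truth tables
# the average sensitivity is K/2 = 1.5 > 1, i.e. "unstable" in the
# damage-spreading classification.
n, K = 50, 3
inputs = [random.sample(range(n), K) for _ in range(n)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(n)]

def step(s):
    return tuple(tables[i][sum(s[j] << b for b, j in enumerate(inputs[i]))]
                 for i in range(n))

def damage(T=30, trials=50):
    """Mean Hamming distance after T steps between a trajectory and a
    copy started with a single flipped node."""
    total = 0
    for _ in range(trials):
        s = tuple(random.randint(0, 1) for _ in range(n))
        s2 = list(s); s2[0] ^= 1; s2 = tuple(s2)
        for _ in range(T):
            s, s2 = step(s), step(s2)
        total += sum(a != b for a, b in zip(s, s2))
    return total / trials

d = damage()
print(f"average damage after a one-bit flip: {d:.2f}")
```

The abstract's point is that this flip-perturbation criterion is a statement about discrete damage, and need not agree with how the underlying continuous dynamics responds to small perturbations of the state vector.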

Abstract:
Boolean networks serve as discrete models of regulation and signaling in biological cells. Identifying the key controllers of such processes is important for understanding the dynamics and planning further analysis. Here we quantify the dynamical impact of a node as the probability of damage spreading after switching the node's state. We find that the leading eigenvector of the adjacency matrix is a good predictor of dynamical impact in the case of long-term spreading. This so-called eigenvector centrality is also a good proxy for the influence a node's initial state has on the attractor the system eventually arrives at. The quality of prediction is further improved when eigenvector centrality is based on the weighted matrix of activities rather than the unweighted adjacency matrix. Simulations are performed with ensembles of random Boolean networks and a Boolean model of signaling in fibroblasts. The findings are supported by analytic arguments from a linear approximation of damage spreading.
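A rough sketch of the comparison (network size, horizon, and trial counts are arbitrary, and the unweighted adjacency matrix is used rather than the activity-weighted one): score each node by the leading eigenvector obtained by power iteration, and compare against damage measured by direct simulation:

```python
import random
import numpy as np
random.seed(7)

# Random Boolean network with K = 2 inputs per node
n, K = 30, 2
inputs = [random.sample(range(n), K) for _ in range(n)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(n)]

# A[i, j] = 1 if node j is an input of node i (damage can flow j -> i)
A = np.zeros((n, n))
for i in range(n):
    for j in inputs[i]:
        A[i, j] = 1.0

# Leading eigenvector of A^T by power iteration; component j scores the
# long-term spreading power of a perturbation starting at node j
c = np.ones(n)
for _ in range(200):
    c = A.T @ c
    nrm = np.linalg.norm(c)
    if nrm == 0:
        break
    c /= nrm

def step(s):
    return tuple(tables[i][sum(s[j] << b for b, j in enumerate(inputs[i]))]
                 for i in range(n))

def impact(j, T=10, trials=20):
    """Mean Hamming damage T steps after flipping node j."""
    tot = 0
    for _ in range(trials):
        s = tuple(random.randint(0, 1) for _ in range(n))
        s2 = list(s); s2[j] ^= 1; s2 = tuple(s2)
        for _ in range(T):
            s, s2 = step(s), step(s2)
        tot += sum(a != b for a, b in zip(s, s2))
    return tot / trials

imp = np.array([impact(j) for j in range(n)])
print("correlation:", round(float(np.corrcoef(c, imp)[0, 1]), 2))
```

Here the transposed matrix is iterated because damage launched at node j spreads along outgoing links, so j's long-term reach is captured by the corresponding left eigenvector component of the adjacency matrix.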

Abstract:
We propose a new self-organizing mechanism behind the emergence of memory, in which temporal sequences of stimuli are transformed into spatial activity patterns. In particular, the memory emerges despite the absence of temporal correlations in the stimuli. This suggests that neural systems may prepare a spatial structure for processing information before the information itself is available. A simple model illustrating the mechanism is presented, based on three principles: (1) competition between neural units, (2) Hebbian plasticity, and (3) recurrent connections.
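A minimal toy realization of the three principles (all architecture and learning-rate choices below are hypothetical, not the paper's model): winner-take-all competition selects a unit for each stimulus, Hebbian updates pull the winning unit toward its stimulus, and recurrent links learn which unit follows which, converting the temporal order of the sequence into a spatial activation pattern:

```python
import numpy as np
rng = np.random.default_rng(8)

n_units, n_stimuli, dim = 5, 3, 8
W = rng.random((n_units, dim)) * 0.1   # feedforward weights
R = np.zeros((n_units, n_units))       # recurrent weights (predecessor -> successor)
stimuli = rng.random((n_stimuli, dim)) # temporally uncorrelated stimulus sequence

eta = 0.2
for epoch in range(50):
    prev = None
    for s in stimuli:                  # present the sequence in temporal order
        h = W @ s + (R[:, prev] if prev is not None else 0.0)
        win = int(np.argmax(h))        # (1) competition: winner take all
        W[win] += eta * (s - W[win])   # (2) Hebbian plasticity toward the stimulus
        if prev is not None:
            R[win, prev] += eta        # (3) recurrent link records succession
        prev = win

# Read out the spatial code that the temporal sequence has been mapped to
winners, prev = [], None
for s in stimuli:
    h = W @ s + (R[:, prev] if prev is not None else 0.0)
    prev = int(np.argmax(h))
    winners.append(prev)
print("spatial code for the temporal sequence:", winners)
```

After training, the sequence of winning units is a spatial representation of the temporal stimulus order, even though the stimuli themselves carry no temporal correlations.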