Abstract:
This paper proposes a new counterexample-guided abstraction refinement (CEGAR) framework that combines numerical simulation of nonlinear differential equations with linear programming for linear hybrid automata (LHA) to perform reachability analysis on nonlinear hybrid automata. A notion of $\epsilon$-structural robustness is also introduced, which allows the algorithm to validate counterexamples using numerical simulations. Keywords: verification, model checking, hybrid systems, hybrid automata, robustness, robust hybrid systems, numerical simulation, CEGAR, abstraction refinement.
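To illustrate the loop this abstract describes, the following is a minimal Python sketch of a CEGAR-style safety check. The toy dynamics x' = -x, the slack parameter `eps` (standing in for the coarseness of the LHA abstraction), and all helper names are illustrative assumptions, not the paper's actual LP-based construction:

```python
import math

def simulate(x0, t):
    """Numerical 'simulation' of the toy dynamics x' = -x (here exact)."""
    return x0 * math.exp(-t)

def abstract_reach_upper(x0, t, eps):
    """Over-approximate the reachable value with slack eps,
    mimicking a coarse abstraction of the nonlinear flow."""
    return simulate(x0, t) + eps

def cegar_safe(x0, t, unsafe, eps0=2.0, max_iters=20):
    """CEGAR loop: if the abstraction reports reachability of the unsafe
    region, validate the counterexample by simulation; if it is spurious,
    refine the abstraction (halve eps) and repeat."""
    eps = eps0
    for _ in range(max_iters):
        if abstract_reach_upper(x0, t, eps) < unsafe:
            return True       # abstraction proves safety
        if simulate(x0, t) >= unsafe:
            return False      # counterexample validated: truly unsafe
        eps /= 2.0            # spurious counterexample -> refine
    return None               # inconclusive within the iteration budget
```

For example, `cegar_safe(1.0, 1.0, unsafe=1.0)` starts with a spurious counterexample (the slack pushes the over-approximation past the threshold) and proves safety after two refinements, while `cegar_safe(1.0, 1.0, unsafe=0.3)` validates a genuine violation by simulation.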

Abstract:
It is a theorem of Bers that any closed hyperbolic surface admits a pants decomposition consisting of curves of bounded length, where the bound depends only on the topology of the surface. The quantification of the optimal constants has been well studied, and the best upper bounds to date are linear in the genus, a theorem of Buser and Sepp\"al\"a. The goal of this note is to give a short proof of a linear upper bound that slightly improves the best known bounds.

Abstract:
The present study was conducted in an experimental watershed in Attica, Greece, using observed rainfall-runoff events. The objective of the study was the determination of the initial abstraction ratio of the watershed. The average ratio (Ia/S) of the entire watershed was 0.014, while the corresponding ratio at a subwatershed was 0.037. The difference was attributed to the different spatial distribution of land uses and geological formations across the watershed. Both determined ratios are close to the value of 0.05 that has been suggested by many studies as an improvement to the SCS-CN method.

Abstract:
Simulators are generally used during the design of computer architectures. Typically, several simulators with different levels of complexity, speed, and accuracy are employed. For early design space exploration, however, simulators with low complexity, high simulation speed, and reasonable accuracy are desired. These simulators should also have short development times and require little implementation effort when the design changes, so that experiments can be performed and the effects of design changes observed. Such simulators are termed high-level simulators in the context of computer architecture. In this paper, we present multiple levels of abstraction in the high-level simulation of a general-purpose many-core system, where the objective of each level is to improve simulation accuracy without significantly affecting complexity and simulation speed.

Abstract:
Executable specifications and simulations are cornerstones of system design flows. Complex mixed-signal embedded systems can be specified with SystemC AMS, which supports abstraction and extensible models of computation. The language contains semantics for the module connections and synchronization required for analog and digital interaction. Through the synchronization layer, user-defined models of computation, solvers, and simulators can be unified in the SystemC AMS simulator to achieve low-level abstraction and model refinement. These improvements help amplify model aspects and their contribution to the overall system behavior. This work presents the cosimulation of refined models with the timed data flow paradigm of SystemC AMS. The methodology uses C-based interaction between simulators, demonstrated on an RTL model of the Data Encryption Standard. The methodology is flexible and can be applied to early design decision tradeoffs, architecture experimentation, and particularly to model refinement and critical behavior analysis.

Abstract:
In problems where a distribution is concentrated in a lower-dimensional subspace, the covariance matrix becomes singular. This causes problems in downstream statistical analyses, as the inverse of the covariance matrix is often required in the likelihood. Several methods exist to overcome this challenge; the best known are the eigenvalue, singular value, and Cholesky decompositions. In this short note, we develop a new method to deal with the singularity problem while preserving the covariance structure of the original matrix. We compare our alternative with the other methods in a simulation study, generating covariance matrices of various dimensions and dependency structures and comparing the CPU times of each approach.
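The singularity problem and the classical eigendecomposition remedy this abstract lists as a baseline can be sketched as follows; the note's own method is not reproduced here, and the 2x2 case with a hand-coded eigensolver is purely illustrative:

```python
import math

def eig_sym_2x2(a, b, d):
    """Eigenvalues (descending) and unit eigenvectors of the
    symmetric matrix [[a, b], [b, d]]."""
    disc = math.sqrt((a - d) ** 2 / 4.0 + b * b)
    l1 = (a + d) / 2.0 + disc
    l2 = (a + d) / 2.0 - disc
    if abs(b) < 1e-15:  # already diagonal: axis-aligned eigenvectors
        return (((a, (1.0, 0.0)), (d, (0.0, 1.0))) if a >= d
                else ((d, (0.0, 1.0)), (a, (1.0, 0.0))))
    pairs = []
    for l in (l1, l2):
        n = math.hypot(b, l - a)
        pairs.append((l, (b / n, (l - a) / n)))
    return pairs[0], pairs[1]

def pinv_sym_2x2(a, b, d, tol=1e-12):
    """Moore-Penrose pseudo-inverse of a (possibly singular) symmetric
    2x2 covariance matrix: invert only eigenvalues above tol, so the
    zero eigenvalue of a rank-deficient covariance is left untouched."""
    out = [[0.0, 0.0], [0.0, 0.0]]
    for l, v in eig_sym_2x2(a, b, d):
        if l > tol:
            for i in range(2):
                for j in range(2):
                    out[i][j] += v[i] * v[j] / l
    return out
```

For two perfectly correlated unit-variance variables, the covariance [[1, 1], [1, 1]] is singular (eigenvalues 2 and 0), so the ordinary inverse fails; the eigendecomposition route yields the pseudo-inverse [[0.25, 0.25], [0.25, 0.25]] instead.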

Abstract:
The purpose of this short note, is to rewrite Morozov's formula for correlation functions over the unitary group, in a much simpler form, involving the computation of a single determinant.

Abstract:
In the present Short Note, an idea is proposed to explain the emergence and observation of processes in complex media that are driven by fractional non-Markovian master equations. Particle trajectories are assumed to be solely Markovian and described by the continuous-time random walk model. But, as a consequence of the complexity of the medium, each trajectory is supposed to scale in time according to a particular random timescale. The link from this framework to microscopic dynamics is discussed and the distribution of timescales is computed. In particular, when a stationary distribution is considered, the timescale distribution is uniquely determined as a function related to the fundamental solution of the space-time fractional diffusion equation. In contrast, when the non-stationary case is considered, the timescale distribution is no longer unique. Two distributions are computed here: one related to the M-Wright/Mainardi function, which is the Green's function of the time-fractional diffusion equation, and another related to the Mittag-Leffler function, which is the solution of the fractional relaxation equation.
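For reference, the fractional relaxation equation mentioned above (written with a Caputo derivative of order 0 < α ≤ 1 and rate λ > 0, a standard formulation rather than the note's specific setup) has the Mittag-Leffler function as its solution:

```latex
\frac{d^\alpha u}{dt^\alpha} = -\lambda\, u(t), \qquad u(0) = 1
\quad\Longrightarrow\quad
u(t) = E_\alpha(-\lambda t^\alpha),
\qquad
E_\alpha(z) = \sum_{k=0}^{\infty} \frac{z^k}{\Gamma(\alpha k + 1)} .
```

For α = 1 the series reduces to the exponential, recovering ordinary Markovian relaxation.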

Abstract:
The present research was conducted at an experimental watershed in the prefecture of Attica, Greece, using observed rainfall-runoff events selected from a four-year period. The study had two objectives. The first was the determination of the ratio of the initial abstraction Ia to the watershed storage S. The average ratio (Ia/S) was 0.014, while the corresponding ratio at a subwatershed was 0.037; the difference was attributed to the different spatial distribution of land uses across the watershed. The second objective was to examine the effect of the SCS empirical equation on hydrograph simulation. This was investigated by comparing the observed hydrograph with two different simulated hydrographs for each of eighteen selected storm events. The simulated hydrographs were calculated by applying to the watershed's unit hydrograph two time distributions of excess rainfall derived from the SCS method using two different approaches: in the first, the initial abstraction was determined from the observed rainfall-runoff data, while in the second, it was calculated using the SCS empirical equation. The SCS empirical equation was found to estimate a greater amount of initial abstraction, delaying the start of the excess rainfall and the simulated runoff. This resulted in overestimation of the peak flow rate and the time to peak for the majority of the storm events.
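The effect described above can be sketched with the standard SCS-CN runoff equation, where the initial abstraction is Ia = λS and the classical empirical choice is λ = 0.2. The rainfall and storage figures below are hypothetical illustrations, not the study's data:

```python
def scs_runoff(p_mm, s_mm, lam):
    """SCS-CN direct runoff depth (mm) for cumulative rainfall p_mm,
    potential maximum retention s_mm, and initial abstraction ratio lam,
    i.e. Ia = lam * S. No runoff occurs until rainfall exceeds Ia."""
    ia = lam * s_mm
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s_mm)

# Hypothetical storm: 50 mm of rain on a watershed with S = 100 mm.
q_empirical = scs_runoff(50.0, 100.0, 0.2)    # classical Ia = 0.2 S
q_observed = scs_runoff(50.0, 100.0, 0.014)   # ratio fitted from data
```

With the larger empirical ratio, runoff starts later (nothing until rainfall passes 20 mm instead of 1.4 mm) and the computed runoff depth is smaller, which is the mechanism behind the delayed excess rainfall the abstract reports.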