Abstract:
This paper presents a novel hybrid CMOS/MEMS tilt sensor with a 5° resolution over a 300° range. The device uses a MEMS-based semicircular mass suspended from a rigid body, projecting a shadow onto the CMOS-based optical sensor surface. A one-dimensional photodiode array arranged as a uniformly segmented ring is then used to determine the tilt angle by detecting the position of the semicircular mass. The complete sensor occupies an area of under 2.5 mm × 2.5 mm.
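A minimal sketch of how such a segmented-ring readout might be decoded, assuming 60 segments (the 300° range at 5° resolution) and a simple circular-mean position estimator; the segment count, decoding scheme, and function names here are illustrative assumptions, not the paper's actual implementation:

```python
import math

SEGMENTS = 60     # hypothetical: 300° range / 5° resolution
SEG_ANGLE = 5.0   # degrees spanned by each photodiode segment

def tilt_angle(shadowed):
    """Estimate the tilt angle from a ring of photodiode readings.

    `shadowed` is a list of SEGMENTS booleans, True where the
    semicircular mass blocks light.  The tilt is taken as the
    circular mean of the shadowed segment centres, which remains
    correct when the shadow spans the angular wrap-around.
    """
    xs = ys = 0.0
    for i, dark in enumerate(shadowed):
        if dark:
            theta = math.radians((i + 0.5) * SEG_ANGLE)
            xs += math.cos(theta)
            ys += math.sin(theta)
    return math.degrees(math.atan2(ys, xs)) % 360.0
```

Using the circular mean rather than a plain average of segment indices avoids a discontinuity when the shadowed arc straddles the first and last segments of the ring.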

Abstract:
Neuromodulation has wide-ranging potential applications in replacing impaired neural function (prosthetics), as a novel form of medical treatment (therapy), and as a tool for investigating neurons and neural function (research). Voltage- and current-controlled electrical neural stimulation (ENS) are methods that have already been widely applied in both neuroscience and clinical practice for neuroprosthetics. However, there are numerous alternative methods of stimulating or inhibiting neurons. This paper reviews the state of the art in ENS as well as alternative neuromodulation techniques, presenting the operational concepts, technical implementations and limitations, in order to inform system design choices.

Abstract:
This work presents a new method that combines symbolic dynamics methodologies with an Ngram algorithm for the detection and prediction of epileptic seizures. The presented approach specifically applies Ngram-based pattern recognition, after data pre-processing, with similarity metrics, including the Hamming distance and the Needleman–Wunsch algorithm, for identifying unique patterns within epochs of time. Pattern counts within each epoch are used as measures to determine seizure detection and prediction markers. Using 623 hours of intracranial electrocorticogram recordings from 21 patients containing a total of 87 seizures, the sensitivity and false prediction/detection rates of this method are quantified. Results are quantified using individual seizures within each case for training of thresholds and prediction time windows. The statistical significance of the predictive power is further investigated. We show that the method presented herein has significant predictive power in up to 100% of temporal lobe cases, with sensitivities of up to 70–100% and low false prediction rates (dependent on the training procedure). The highest false prediction rates are found in cases of frontal origin, with 0.31–0.61 false predictions per hour, and significance is found in 18 out of 21 cases. On average, a prediction sensitivity of 93.81% and a false prediction rate of approximately 0.06 false predictions per hour are achieved in the best-case scenario. This compares to previous work utilising the same data set that has shown sensitivities of up to 40–50% for a false prediction rate of less than 0.15/hour.
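The core ingredients named above (symbolization of the signal, per-epoch Ngram counting, and Hamming-distance pattern matching) can be sketched as follows. The binning scheme, function names, and parameters are illustrative assumptions rather than the authors' implementation, and the Needleman–Wunsch alignment step is omitted for brevity:

```python
from collections import Counter

def symbolize(signal, thresholds):
    """Map raw samples to symbols by threshold binning (symbolic dynamics).
    `thresholds` is a sorted list; a sample landing in bin k gets symbol k."""
    return "".join(str(sum(x > t for t in thresholds)) for x in signal)

def ngram_counts(symbols, n):
    """Count every length-n pattern occurring in one epoch of the symbol string."""
    return Counter(symbols[i:i + n] for i in range(len(symbols) - n + 1))

def hamming(a, b):
    """Hamming distance between two equal-length patterns."""
    return sum(x != y for x, y in zip(a, b))

def similar_pattern_count(counts, template, max_dist):
    """Total occurrences of patterns within `max_dist` of `template`;
    such per-epoch counts would serve as detection/prediction markers."""
    return sum(c for pat, c in counts.items()
               if hamming(pat, template) <= max_dist)
```

A per-epoch marker is then simply the count of occurrences of patterns similar to a template of interest, compared against a trained threshold.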

Abstract:
In the previous issue of Critical Care, Cohen and colleagues [1] offer a new approach to identifying and describing states of critical illness. The work follows a path, launched by John Siegel and colleagues [2,3] almost two decades ago, toward letting the data themselves define densely populated regions of physiologic state space that collectively represent a clinical condition. Areas of densely and of sparsely populated regions of the state space arise spontaneously from interconnections among various organ systems and their constituent tissues [4]. What Cohen and colleagues have added to the analysis are bioinformatic tools developed, applied, and validated in the service of genomic analysis. Heat maps representing relative expression and hierarchical clustering give a sense of similarity of states and their adjacencies in physiologic state space, respectively. But the report has a deeper significance that perhaps can be grasped by inspection of Figure 1. When we clinicians glance up at a bedside physiologic display ('monitor') and look at the heart rate and blood pressure, we obtain the picture seen in Figure 1a. The difficulty is that the present state can be reached from many trajectories, so that the important inverse problem, namely 'what condition led to the particular values of the blood pressure and heart rate', is ill-posed in the sense of Hadamard [5,6]. There are essentially an infinite number of trajectories that lead to this point. One approach to clarifying the problem is to generate a mathematical model and then ask what sort of perturbation would offer the most clarification as to the actual condition of the patient [7]. Another approach is to look backwards in time, as in Figure 1b, to see whether there is a clue concerning a trend. Either way, the question/answer that many clinicians think they wish to know is represented in Figure 1c: 'what will the patient's physiology look like at some time in the future, and what is my level of confidence in t

Abstract:
The mind is never passive; it is a perpetual activity, delicate, receptive, responsive to stimulus. You cannot postpone its life until you have sharpened it. Whatever interest attaches to your subject-matter must be evoked here and now; whatever powers you are strengthening in the pupil, must be exercised here and now; whatever possibilities of mental life your teaching should impart, must be exhibited here and now. That is the golden rule of education, and a very difficult rule to follow. (Alfred North Whitehead, Presidential Address to the Mathematical Association, January 1916) Peets and colleagues report on a strategy for selecting content for inclusion in a critical care curriculum for residents [1]. The authors constructed a three-domain classification of common clinical problems and asked resident trainees and attendings to score each problem according to the threat to life, to frequency and to reversibility. The scales were organized to give greatest weight to greater life-threat, higher frequency and ease of reversibility. The authors report strong concurrence between the product of domain scores of resident trainees and of their supervising attending physicians. In their conclusion, the authors assert that their process is widely applicable and 'can facilitate creation of a reliable and valid curriculum' [1]. It is unsurprising that residents and their teaching staff should have similar assessments of the three objective features listed. For example, brain death – which appears at the bottom of the priority list – is irreversible by definition. If any resident or attending scored brain death as anything other than not reversible, it would be at once surprising and problematic. Similarly, the frequency of the condition of brain death in the intensive care unit (ICU) studied and the degree to which brain death threatens life are not matters for debate. What is of greater concern, however, is that the methodology advanced by the authors results in brain death bein

Abstract:
It has been forty-four years since the American Nurses Association (ANA) published its first position paper on entry into practice, advocating that the baccalaureate degree be the minimum degree for entry into registered nurse practice. During that time period, only one state, North Dakota, succeeded in fully implementing the entry into practice proposal, and even there it was rescinded in 2003. Although there has been much general discussion as to why the proposal initially failed, little has been written on this topic strictly from a policy perspective. This article begins with a brief history of nursing education leading up to the 1965 entry into practice paper (ANA 1965a). This is followed by a look at recent discussions and developments concerning the entry into practice proposal, an examination of the events surrounding the failure of the proposal in the original four focus states identified by the ANA for early implementation, and a discussion identifying possible reasons for the proposal’s initial failures. Encouragement and suggestions for present and future entry into practice supporters are provided.

Abstract:
Let $\mathbb{F}_{q}$ be a finite field of order $q=p^k$ where $p$ is prime. Let $P$ and $L$ be sets of points and lines respectively in $\mathbb{F}_{q} \times \mathbb{F}_{q}$ with $|P|=|L|=n$. We establish the incidence bound $I(P,L) \leq \gamma n^{3/2 - 1/12838}$, where $\gamma$ is an absolute constant, so long as $P$ satisfies the conditions of being an `antifield'. We define this to mean that the projection of $P$ onto some coordinate axis has no more than half-dimensional interaction with large subfields of $\mathbb{F}_q$. In addition, we give examples of sets satisfying these conditions in the important cases $q=p^2$ and $q=p^4$.

Abstract:
This thesis establishes new quantitative records in several problems of incidence geometry and growth. After the necessary background in Chapters 1, 2 and 3, the following results are proven. Chapter 4 gives new results in the incidence geometry of a plane determined by a finite field of prime order. These comprise a new upper bound on the total number of incidences determined by finitely many points and lines, and a new estimate for the number of distinct lines determined by a finite set of non-collinear points. Chapter 5 gives new results on expander functions. First, a new bound is established for the two-variable expander a+ab over a finite field of prime order. Second, new expanders in three and four variables are demonstrated over the real and complex numbers with stronger growth properties than any functions previously considered. Finally, Chapter 6 gives the first bespoke sum-product estimate over function fields, a setting that has so far been largely unexplored for these kinds of problems. This last chapter is joint work with Thomas Bloom.

Abstract:
We use the theory of cross ratios to construct a real-valued function f of only three variables with the property that for any finite set A of reals, the set f(A) = {f(a,b,c):a,b,c \in A} has cardinality at least C|A|^2/log|A|, for an absolute constant C. Previously known functions with this property had all been of four variables. We also improve on the state of the art for functions of four variables by constructing a function g for which g(A) has cardinality at least C|A|^2; the previous best bound was C|A|^2/log|A|. Finally, we give an example of a five-variable function h for which h(A) has cardinality at least C|A|^4/log|A|. Proving these results depends only on the Szemerédi–Trotter incidence theorem and an analogous result for planes due to Edelsbrunner, Guibas and Sharir, each applied in the Erlangen-type framework of Elekes and Sharir. In particular the proofs do not employ the Guth–Katz polynomial partitioning technique or the theory of ruled surfaces. Although the growth exponents for f, g and h are stronger than those for previously considered functions, it is not clear that they are necessarily sharp. So we pose a question as to whether the bounds on the cardinalities of f(A), g(A) and h(A) can be further strengthened.
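For background, the classical cross ratio of four distinct reals, which the construction draws on, is

\[
(a,b;c,d) \;=\; \frac{(a-c)(b-d)}{(a-d)(b-c)},
\]

and sending one point to infinity, say $d \to \infty$, leaves the three-variable quantity $(a-c)/(b-c)$. Whether the function f of the abstract arises in exactly this way is not specified here, so this should be read as illustrative background on cross ratios rather than as the paper's construction.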