
Books by Hans-Andrea Loeliger

  • by Hans-Andrea Loeliger & Elizabeth Ren
    787,95 kr.

    With the increasing availability of computational power, digital signal analysis algorithms have the potential to evolve from the common framewise mode of operation to samplewise operation, which offers more precision in time. This thesis discusses a set of methods with samplewise operation: local signal approximation via Recursive Least Squares (RLS), where a mathematical model is fit to the signal within a sliding window at each sample. Both the signal models and the cost windows are generated by Autonomous Linear State Space Models (ALSSMs). The modeling capability of ALSSMs is vast: they can model exponentials, polynomials, and sinusoidal functions, as well as any linear and multiplicative combination thereof. The fitting method offers efficient recursions, subsample precision by way of the signal model, and additional goodness-of-fit measures based on the recursively computed fitting cost. Classical methods such as standard Savitzky-Golay (SG) smoothing filters and the Short-Time Fourier Transform (STFT) are united under a common framework.

    First, we complete the existing framework. The ALSSM parameterization and RLS recursions are provided for a general function, and the solutions of the fit parameters for different constraint problems are reviewed. Moreover, feature extraction from both the fit parameters and the cost is detailed, along with examples of their use. In particular, we introduce terminology to analyze the fitting problem from the perspective of projection onto a local Hilbert space and as a linear filter. Analytical rules are given for computing the equivalent filter response and the steady-state precision matrix of the cost.

    After establishing the local approximation framework, we further discuss two classes of signal models in particular, namely polynomial and sinusoidal functions. The signal models are complementary: by nature, polynomials are suited to time-domain descriptions of signals, while sinusoids are suited to the frequency domain.

    For local approximation by polynomials, we derive analytical expressions for the steady-state covariance matrix and the linear filter of the coefficients, based on the theory of orthogonal polynomial bases. We then discuss the fundamental application of smoothing filters based on local polynomial approximation. We generalize standard SG filters to any ALSSM window and introduce a novel class of smoothing filters based on polynomial fitting to running sums.
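The SG view of smoothing described in this abstract — a least-squares polynomial fit in a sliding window, evaluated at the window centre — can be sketched in a few lines. This is a minimal illustration of the classical SG idea only, not the thesis's ALSSM framework; the function name, window half-width, and degree are assumptions.

```python
import numpy as np

def local_poly_smooth(y, half_width=5, degree=2):
    """Savitzky-Golay-style smoothing: least-squares fit of a polynomial
    of the given degree over a sliding window of 2*half_width + 1 samples,
    evaluated at the window centre (t = 0)."""
    t = np.arange(-half_width, half_width + 1)
    A = np.vander(t, degree + 1, increasing=True)  # local monomial basis
    h = np.linalg.pinv(A)[0]   # row mapping a window to the fit's value at t = 0
    return np.convolve(y, h[::-1], mode="same")    # slide the projection

# An SG filter reproduces polynomials up to its fit degree exactly,
# away from the boundaries where the window is zero-padded.
x = np.arange(50.0)
y = 0.5 * x**2 - 3.0 * x + 1.0
s = local_poly_smooth(y)
print(np.allclose(s[5:-5], y[5:-5]))  # True
```

The same structure generalizes to the ALSSM setting by replacing the monomial basis and rectangular window with model-generated ones.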

  • by Hans-Andrea Loeliger & Raphael Urs Keusch
    972,95 kr.

  • by Patrick Murer & Hans-Andrea Loeliger
    652,95 kr.

    This thesis studies the capability of spiking recurrent neural network models to memorize dynamical pulse patterns (or firing signals).

    In the first part, discrete-time firing signals (or firing sequences) are considered. A recurrent network model, consisting of neurons with bounded disturbance, is introduced to analyze (simple) local learning. Two modes of learning/memorization are considered: the first mode is strictly online, with a single pass through the data, while the second mode uses multiple passes through the data. In both modes, the learning is strictly local (quasi-Hebbian): at any given time step, only the weights between the neurons firing (or supposed to be firing) at the previous time step and those firing (or supposed to be firing) at the present time step are modified. The main result is an upper bound on the probability that the single-pass memorization is not perfect. It follows that the memorization capacity in this mode asymptotically scales like that of the classical Hopfield model (which, in contrast, memorizes static patterns). However, multiple-rounds memorization is shown to achieve a higher capacity, with an asymptotically nonvanishing number of bits per connection/synapse. These mathematical findings may be helpful for understanding the functionality of short-term and long-term memory in neuroscience.

    In the second part, firing signals in continuous time are studied. It is shown how firing signals containing firings only on a regular time grid can be (robustly) memorized with a recurrent network model. In principle, the corresponding weights are obtained by supervised (quasi-Hebbian) multi-pass learning. As in the discrete-time case, the asymptotic memorization capacity is a nonvanishing number of bits per connection/synapse. Furthermore, the timing robustness of the memorized firing signals is investigated under different disturbance models, and the regime of disturbances in which the relative occurrence times of the firings are preserved over a long time span is elaborated for each of them. The proposed models have the potential for energy-efficient, self-timed neuromorphic hardware implementations.
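The single-pass, strictly local update described in this abstract can be caricatured as follows. This is a toy Hopfield-style sequence memory with ±1 coding and dense random patterns, not the thesis's bounded-disturbance spiking model; all sizes, the coding, and the threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 500, 10                                   # neurons, sequence length
X = (rng.random((T, N)) < 0.5).astype(float)     # target firing sequence in {0,1}

# Single pass, strictly local (quasi-Hebbian): each transition t-1 -> t only
# touches weights between neurons involved in those two time steps.
W = np.zeros((N, N))
for t in range(1, T):
    pre = 2.0 * X[t - 1] - 1.0                   # +/-1 coding, previous pattern
    post = 2.0 * X[t] - 1.0                      # +/-1 coding, present pattern
    W += np.outer(post, pre) / N

# Replay: seed with the first pattern and threshold the recurrent drive.
x, perfect = X[0], True
for t in range(1, T):
    x = (W @ (2.0 * x - 1.0) > 0.0).astype(float)
    perfect &= bool(np.array_equal(x, X[t]))
```

With T much smaller than N, the cross-talk between stored transitions is small and replay recovers the whole sequence with high probability, mirroring the Hopfield-like capacity scaling mentioned above.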
