On the relevance of discrepancy norm for similarity-based clustering of delta-event-sequences
|Authors|| Bernhard Moser|
|Editors|| A. Quesada-Arencibia et al.|
|Title|| On the relevance of discrepancy norm for similarity-based clustering of delta-event-sequences|
|Book title|| EUROCAST 2013 Computer Aided Systems Theory Extended Abstracts|
|Publisher|| IUCTC Universidad Las Palmas|
The way in which biological neurons respond to stimuli follows a threshold-based sampling scheme. Rather than sampling equidistantly in time, as in classical sampling, this scheme is triggered by the event that the intensity of a signal surpasses a threshold. Inspired by biology, this sampling principle has also been studied and used in technical applications. In the signal processing context it is known as on-delta-send, Lebesgue, level-based or event-based sampling. Reasons for studying and introducing level-based sampling are a) the reduction of the amount of data transfer, e.g. in wireless sensor networks, and b) the realization of high dynamic ranges for bio-inspired sensory systems such as the Silicon Retina or the Silicon Cochlea. First, the geometric structure of the space of event sequences resulting from on-delta-send sampling is studied. It is shown that a recently published result from discrete geometry can be applied to characterize its geometry, which turns out to be closely related to the so-called discrepancy measure. The discrepancy measure goes back to Hermann Weyl and was proposed in the context of measuring irregularities of probability distributions. It turns out that this measure satisfies the axioms of a norm, which is distinguished by monotonicity and a Lipschitz property of its auto-misalignment function. Applications of the discrepancy measure can be found in the field of numerical integration, especially for Monte Carlo methods in high dimensions, as well as in computational geometry and image processing.
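As an illustrative sketch (not code from the paper), the two ingredients of the abstract can be made concrete: on-delta-send sampling of a signal into a sequence of signed events, and the discrepancy norm of a finite sequence, ||x||_D = max over index intervals [m, n] of |x_m + ... + x_n|, which is computable in linear time from prefix sums as max(0, max_k S_k) - min(0, min_k S_k). The function names and the threshold value below are illustrative choices, not notation from the paper.

```python
def on_delta_send(samples, delta):
    """Sketch of on-delta-send sampling: emit a +1 / -1 event each time
    the signal has moved by at least `delta` relative to the last
    reference level (the level is advanced in steps of `delta`)."""
    events = []
    ref = samples[0]
    for t, v in enumerate(samples[1:], start=1):
        while v - ref >= delta:      # signal rose by one threshold step
            events.append((t, +1))
            ref += delta
        while ref - v >= delta:      # signal fell by one threshold step
            events.append((t, -1))
            ref -= delta
    return events

def discrepancy_norm(x):
    """Discrepancy norm ||x||_D = max over intervals of |partial sum|,
    computed as (max prefix sum) - (min prefix sum), both clamped at 0."""
    s = 0.0
    s_max = s_min = 0.0
    for v in x:
        s += v
        s_max = max(s_max, s)
        s_min = min(s_min, s)
    return s_max - s_min

# Example: sample a small signal and measure a sequence's discrepancy.
events = on_delta_send([0.0, 0.5, 1.2, 0.9, 2.1], delta=1.0)
print(events)                              # [(2, 1), (4, 1)]
print(discrepancy_norm([1, -1, 1, 1]))     # 2.0
```

Note how the discrepancy norm differs from the l1 norm: alternating signs cancel inside an interval, so well-interleaved event sequences of opposite polarity have small discrepancy, which is what makes the measure suitable for comparing the alignment of delta-event sequences.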