A boost for data processing and extraction ‘on the fly’

A faster way of gathering and processing input data promises more efficient equipment for the biomedical field and beyond.

In a world experiencing an explosion of information, gathering and extracting the right data is pivotal to creating more powerful high-tech applications and designing faster equipment. One important computational approach is to remove redundant data through improved sparse modelling, a rapidly developing area that brings together statistics, machine learning and signal processing. In computing terms, sparse models comprise mostly zeros and only a few nonzero parameters, and they rely on novel theoretical and algorithmic tools to achieve their aims.
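To make the idea concrete, the short Python sketch below (illustrative only, not project code) builds a signal from three sinusoids: it looks dense in the time domain, yet only a handful of its Fourier coefficients are significant.

```python
import numpy as np

# Illustrative sketch (not SOL project code): a signal that looks dense in
# the time domain but admits a sparse representation in the Fourier domain.
rng = np.random.default_rng(0)
n = 512
t = np.arange(n) / n
# Signal composed of just three sinusoids plus a little noise.
x = (np.sin(2 * np.pi * 7 * t)
     + 0.5 * np.sin(2 * np.pi * 31 * t)
     + 0.2 * np.sin(2 * np.pi * 60 * t)
     + 0.01 * rng.standard_normal(n))

coeffs = np.fft.rfft(x)                                     # frequency-domain representation
significant = np.abs(coeffs) > 0.1 * np.abs(coeffs).max()

print(f"time-domain samples:        {n}")
print(f"significant Fourier coeffs: {significant.sum()}")   # only a handful
```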

Against this backdrop, the EU-funded SOL (Sparse Online Learning) project worked on developing new theory and algorithms for sparsity-aware learning ‘on the fly’. Instead of storing data and processing it later, the project sought to process data immediately, in real time, as it becomes available.
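The summary does not name the specific algorithms developed, but the flavour of sparsity-aware online learning can be conveyed with a generic sketch: each incoming sample triggers a single gradient step, followed by a soft-thresholding operation that keeps most of the model's parameters at exactly zero.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of the l1 norm: shrinks small entries to exactly zero."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def online_sparse_regression(stream, dim, step=0.01, lam=0.05):
    """Generic sketch of sparsity-aware online learning (not the SOL algorithms):
    for each incoming (x, y) pair, take a stochastic gradient step on the
    squared error and then soft-threshold to promote sparsity."""
    w = np.zeros(dim)
    for x, y in stream:                      # samples arrive one at a time
        grad = (w @ x - y) * x               # gradient of 0.5 * (w.x - y)^2
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy usage: streaming data generated by a sparse ground-truth vector.
rng = np.random.default_rng(0)
dim = 100
w_true = np.zeros(dim)
w_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
stream = ((x, w_true @ x + 0.01 * rng.standard_normal())
          for x in (rng.standard_normal(dim) for _ in range(20000)))

w_hat = online_sparse_regression(stream, dim)
top = np.argsort(-np.abs(w_hat))[:3]
print("largest learned weights at indices:", np.sort(top))   # recovers the true support
```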

To achieve its aims, the project team designed the required sparsity-aware algorithms to enable effective real-time operation. It integrated advanced sparsity structures into the online learning framework and enhanced the learning process by gathering data from multiple-sensor devices and topologies that exploit joint sparsity structures. This involved developing a platform to accurately benchmark the newly developed techniques against competing methods in the field.
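One common way to encode joint sparsity across sensors, shown below purely as an illustration rather than as the project's own model, is to stack the sensors' coefficient vectors into a matrix and shrink entire rows at once, so that a coefficient is retained or discarded jointly for all sensors.

```python
import numpy as np

def row_group_shrink(W, tau):
    """Illustrative joint-sparsity operator (not necessarily the SOL model):
    proximal operator of the l2,1 norm. Each row of W holds one coefficient
    shared across all sensors; rows with small joint energy are zeroed for
    every sensor simultaneously."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)              # per-row l2 norm
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return W * scale

# Toy usage: 200 candidate coefficients observed by 5 sensors,
# with only 4 coefficient rows truly active across all sensors.
rng = np.random.default_rng(1)
W = 0.05 * rng.standard_normal((200, 5))          # background noise
W[[10, 50, 90, 150], :] += rng.standard_normal((4, 5))

W_sparse = row_group_shrink(W, tau=0.4)
print("jointly active rows:", np.flatnonzero(np.linalg.norm(W_sparse, axis=1) > 0))
```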

Applying the newly developed techniques in the biomedical domain, SOL developed an innovative wireless electrocardiogram (ECG) monitoring system that is more energy-efficient and potentially more powerful than existing ECG technology.

The team also extended its work to more general cases where sparsity and advanced structures are employed in data matrix factorisation and analysis. It investigated tasks involving robust subspace tracking, online and distributed dictionary learning, and dictionary-learning-based matrix factorisation in functional magnetic resonance imaging (fMRI) analysis.
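As a rough illustration of online dictionary learning, and not of the project's specific algorithms, the sketch below alternates two steps for every incoming sample: sparse coding against the current dictionary, followed by a small corrective update of the dictionary itself.

```python
import numpy as np

def soft_threshold(v, tau):
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def sparse_code(D, x, lam=0.1, n_iter=50):
    """A few ISTA iterations: find a sparse code z with x ≈ D @ z."""
    L = np.linalg.norm(D, 2) ** 2             # Lipschitz constant of the gradient
    z = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = soft_threshold(z - (D.T @ (D @ z - x)) / L, lam / L)
    return z

def online_dictionary_learning(samples, n_atoms, step=0.05, lam=0.1):
    """Minimal sketch of online dictionary learning (not the SOL algorithms):
    for each incoming sample, compute its sparse code with the current
    dictionary, then nudge the dictionary to better reconstruct the sample."""
    dim = samples[0].shape[0]
    rng = np.random.default_rng(0)
    D = rng.standard_normal((dim, n_atoms))
    D /= np.linalg.norm(D, axis=0)             # unit-norm atoms
    for x in samples:
        z = sparse_code(D, x, lam)
        D -= step * np.outer(D @ z - x, z)     # gradient step on the fit error
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
    return D

# Toy usage: learn 30 unit-norm atoms from 500 streaming 20-dimensional samples.
rng = np.random.default_rng(2)
data = [rng.standard_normal(20) for _ in range(500)]
D = online_dictionary_learning(data, n_atoms=30)
print("dictionary shape:", D.shape)            # (20, 30)
```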

Alternative mathematical tools based on randomised projections were applied to dimensionality reduction and incorporated into previously developed algorithms, reducing the computational time of fMRI data analysis. The team subsequently developed a novel robust linear regression method based on randomised projections for large-data applications.
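A minimal sketch of the random-projection idea, assuming a plain Gaussian projection matrix (the project's exact construction is not detailed here): multiplying high-dimensional data by a random matrix yields a much smaller representation whose pairwise distances are approximately preserved, so downstream analysis can run on far fewer dimensions.

```python
import numpy as np

# Illustrative sketch (not SOL project code): Gaussian random projection for
# dimensionality reduction. Pairwise distances are approximately preserved
# (Johnson-Lindenstrauss lemma).
rng = np.random.default_rng(3)
n_samples, high_dim, low_dim = 200, 10_000, 400

X = rng.standard_normal((n_samples, high_dim))          # stand-in for high-dimensional data (e.g. fMRI features)
P = rng.standard_normal((high_dim, low_dim)) / np.sqrt(low_dim)
X_low = X @ P                                           # reduced representation

# Compare a pairwise distance before and after projection.
i, j = 0, 1
d_high = np.linalg.norm(X[i] - X[j])
d_low = np.linalg.norm(X_low[i] - X_low[j])
print(f"distance in {high_dim}-D: {d_high:.1f}, after projection to {low_dim}-D: {d_low:.1f}")
```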

The new algorithms and techniques for modelling, analysing and reconstructing signals have proven very useful, particularly because they operate in an online rather than a batch fashion. They can process large amounts of data very efficiently, opening the door to many new and emerging applications that demand such capability. The project’s valuable results were published in leading journals, several book chapters and many conference papers.
