Linköping University

The Linnaeus Center for Control, Autonomy, and Decision-making in Complex Systems (CADICS)

Granted: 55 MSEK
Contact: Lennart Ljung
Website: CADICS

The goal of CADICS is to leverage this unique combination of resources and to create a research framework that encourages concrete collaboration between groups across IT disciplines. CADICS will tackle the challenging new research problems that have arisen from the qualitative jump in the sophistication of complex systems being developed today and envisioned for the future.

The Linnaeus Center for Control, Autonomy, and Decision-making in Complex Systems (CADICS) brings together five research groups at Linköping University: Artificial Intelligence (Patrick Doherty), Sensor Informatics (Fredrik Gustafsson), Automatic Control (Lennart Ljung), Vehicular Systems (Lars Nielsen) and Scientific Visualization (Anders Ynnerman).

Each of these groups has received international attention and recognition for its work. The results cover a broad spectrum, from basic research in theory and methodology to algorithms and software. All groups have also been involved in practical and industrial applications of various kinds.

The center will develop new theory and methodology based on the broad, joint perspectives of the groups. This will be achieved both by focusing on cross-disciplinary problem areas, such as sensor fusion and autonomy, and by testing and evaluating ideas in concrete practical applications, such as autonomous aerial vehicles, automobiles and medical support functions.

Linnaeus Centre for Research on Hearing and Deafness (HEAD): Excellence in the field of Cognitive Hearing Science

Granted: 50 MSEK
Contact: Jerker Rönnberg
Website: Linnaeus Centre HEAD

This centre works in Cognitive Hearing Science, a new, interdisciplinary field that focuses on how hearing-impaired and deaf people deploy cognitive resources to communicate in realistic, everyday situations.

The researchers will chart and model the dynamic interplay in the nervous system between human cognition and auditory signal processing, both aided and unaided. The proposed research will investigate how the brain constructs meaning from degraded and distorted input signals, and how brain plasticity varies in relation to the developmental trajectories of cognition and language across the lifespan.

The communicative consequences of available sensory modality (visual, auditory and audiovisual) and preferred language modality (sign and speech) are studied at neural, cognitive and social levels of analysis to ensure robust modelling.

The malleability of the sensory-cognitive interface is addressed in targeted intervention studies. Model generalization is tested in comparative studies including persons with various diagnoses, such as specific language impairment, dyslexia, and deaf-blindness, which may implicate hearing-based cognitive functions such as the phonological loop of working memory.

The unique combination of expertise within the HEAD team, in both the modelling of different processing requirements and the application of theory to different client groups, is a strong indicator of the expected success of the proposed programme.

Updated: 2014-05-06