SOLOMON – Self-Organisation and Learning Online in Mobile Observation Networks

Funding: EU H2020 – Marie Skłodowska-Curie Individual Fellowship
Partners: Aston University, University of Birmingham, Edesix Ltd

Abstract: Smart cameras are embedded devices combining a visual sensor, a processing unit and a communication interface, allowing images to be processed on the device so that only aggregated information, rather than raw video data, is transmitted. Smart camera networks are typically used for large-scale, high-value security applications such as person tracking in airports or amusement parks. However, current smart cameras are expensive and have only very limited mobility, acting as a barrier to their wider adoption.
The SOLOMON project is driven by the rising demand for rapid-deployment camera networks which can adapt to provide security in the context of unforeseen situations and unfolding scenarios. This is evidenced by the rapid growth of leading body-cam company Edesix Ltd, whose VideoBadge technology is being adopted by police forces worldwide. However, recent research advances in smart camera networks have not yet been realised in dynamic body-worn camera networks and still rely on prohibitively expensive static hardware.
In the SOLOMON project we will develop a novel type of lightweight, inexpensive smart camera network suitable for rapid deployment and reconfiguration, where low-cost camera devices such as Edesix’s VideoBadge are paired with the processing capabilities of smartphones. These are then worn by people (e.g. police, security guards) or mounted on mobile robots. This not only lowers cost, but allows us to introduce a feedback loop between the sensing cameras and the acting people/robots, enabling the camera network to adapt to changes at runtime, for example to prioritise or cover newly relevant areas in response to an unfolding situation. To achieve this vision, novel techniques in collective decision making and self-organisation, as well as multi-objective online learning, will be developed.
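The on-device aggregation principle described above can be sketched as follows. This is a minimal illustration only: the names (`Aggregate`, `detect_people`, `process_frame`) and the trivial detector are assumptions for illustration, not part of the SOLOMON project or the VideoBadge API.

```python
from dataclasses import dataclass

@dataclass
class Aggregate:
    """The only data that crosses the network: a small summary, not raw video."""
    camera_id: str
    person_count: int

def detect_people(pixels):
    # Stand-in for an on-device detector; here any non-zero pixel
    # counts as a detection, purely to keep the sketch self-contained.
    return sum(1 for p in pixels if p > 0)

def process_frame(camera_id, pixels):
    # All image processing happens on the camera device itself;
    # only the compact Aggregate record is transmitted.
    return Aggregate(camera_id, detect_people(pixels))

msg = process_frame("badge-01", [0, 3, 0, 7, 1])
print(msg)  # Aggregate(camera_id='badge-01', person_count=3)
```

In a real smart camera the detector would be a vision model running on the embedded processor, but the bandwidth argument is the same: the transmitted record is a few bytes regardless of frame size.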

CPS/IoT Ecosystem: Preparing Austria for the Next Digital Revolution

Funding: Ministry of Science, Research and Economy (BMWFW): Infrastructure Grant
Partners: TU Wien, Austrian Institute of Technology (AIT), Institute of Science and Technology (IST)

Abstract: Cyber-physical systems (CPS) are spatially-distributed, time-sensitive, multiscale networked embedded systems, connecting the physical world to the cyber world through sensors and actuators. The Internet of Things (IoT) is the backbone of CPS. It connects sensors and actuators to nearby gateways, and the gateways to the Fog and the Cloud. The Fog resembles the human spine, providing fast and adequate response to imminent situations. The Cloud resembles the human brain, providing large storage and analytic capabilities.
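The spine/brain tiering above can be made concrete with a small routing sketch: latency-critical events are handled in the Fog, everything else flows to the Cloud for storage and analytics. The 100 ms threshold, the event fields, and the `route` function are illustrative assumptions, not part of the project's architecture.

```python
# Hypothetical latency budget below which an event must be handled locally.
FOG_DEADLINE_MS = 100

def route(event):
    """Decide which tier handles an event, given its latency budget."""
    if event["deadline_ms"] <= FOG_DEADLINE_MS:
        return "fog"    # spine-like: fast, local reaction
    return "cloud"      # brain-like: large storage and heavy analytics

events = [
    {"sensor": "smoke-7",  "deadline_ms": 50},     # imminent situation
    {"sensor": "meter-12", "deadline_ms": 60000},  # periodic reading
]
print([route(e) for e in events])  # ['fog', 'cloud']
```

The design point is that the routing decision is made at the gateway, so time-sensitive control loops never depend on a round trip to the Cloud.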
In this project we will make Austria a major player in Real-Time (RT) CPS/IoT by building on its national strengths. In collaboration with renowned Austrian companies such as TTTech and ams AG, we will create an RT CPS/IoT ecosystem with more than 5000 sensors and actuators, where we can all experiment with new ideas and in this way develop Austrian know-how. This effort will be aligned with the strategic Austrian initiatives Industry 4.0 and Silicon Austria. The ecosystem will be distributed across Vienna in collaboration with our partners at AIT and IST.

EPiCS – Engineering Proprioception in Computing Systems

Funding: EU FP7-ICT
Partners: University of Paderborn (Germany), Alpen-Adria-Universität Klagenfurt (Austria), Austrian Institute of Technology (AIT) (Austria), Eidgenössische Technische Hochschule Zürich (ETH) (Switzerland), Airbus Defence and Space (Germany), University of Oslo (Norway), Imperial College London (UK), University of Birmingham (UK)

Abstract: The EPiCS project aims at laying the foundation for engineering the novel class of proprioceptive computing systems. Proprioceptive computing systems collect and maintain information about their state and progress, which enables self-awareness by reasoning about their behaviour, and self-expression by effectively and autonomously adapting their behaviour to changing conditions. Concepts of self-awareness and self-expression are new to the domains of computing and networking; the successful transfer and development of these concepts will help create future heterogeneous and distributed systems capable of efficiently responding to a multitude of requirements with respect to functionality and flexibility, performance, resource usage and costs, reliability and safety, and security. Innovations from EPiCS are based on the systematic integration of research in concepts and foundations for self-aware and self-expressive systems with novel hardware/software platform technologies and architectures for engineering autonomic compute nodes and networks. EPiCS drives and validates its research through the requirements of three challenging application domains that cover both high-end computers and embedded systems, as well as embeddings into technical and non-technical contexts.
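The self-awareness/self-expression pairing described above is essentially an observe-reason-adapt loop, which the following sketch illustrates. The class, its attributes, and the thresholds are all hypothetical, not taken from EPiCS itself.

```python
class ProprioceptiveNode:
    """Toy compute node that maintains knowledge of its own state (self-awareness)
    and autonomously adapts its behaviour based on it (self-expression)."""

    def __init__(self):
        self.load_history = []   # self-knowledge collected at runtime
        self.worker_threads = 2  # behaviour the node can adapt

    def observe(self, load):
        # Self-awareness: collect and maintain information about own state.
        self.load_history.append(load)

    def adapt(self):
        # Self-expression: change behaviour from that self-knowledge,
        # here by scaling worker threads on recent average load.
        recent = sum(self.load_history[-3:]) / min(len(self.load_history), 3)
        if recent > 0.8:
            self.worker_threads += 1
        elif recent < 0.2 and self.worker_threads > 1:
            self.worker_threads -= 1

node = ProprioceptiveNode()
for load in [0.9, 0.95, 0.85]:
    node.observe(load)
    node.adapt()
print(node.worker_threads)  # 5: scaled up once per high-load observation
```

The point of the sketch is the separation of the two concerns: `observe` builds the node's model of itself, while `adapt` is the only place behaviour changes, and it consults that model rather than raw inputs.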