FLOCKD – Federated Learning for Online Collaborative Knowledge and Decision-making
Funding: DFF FTP1
Partners: National University of Singapore, Franklin & Marshall College, University of Parma
The FLOCKD project will investigate the distribution of Deep Neural Networks (DNNs) in smart camera networks, allowing individual cameras in a networked setting to classify and predict the trajectories and actions of observed objects. While DNNs are very successful in identification and prediction tasks, they are resource-intensive to train and maintain. To overcome this, federated learning has been proposed, combining the learned models of different devices. However, due to the differing perceptions of the cameras, a single common DNN might not be viable, and individual, specialised DNNs are required. While utilising such specialised networks, we will also develop approaches that allow cameras to request feedback from each other by sharing their specialised networks on demand. We hypothesise this will lead to better network-wide inference.
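Combining the learned models of different devices is commonly done by weighted averaging of model parameters, in the style of federated averaging (FedAvg). A minimal sketch of that aggregation step, in which the function name and the flat weight-vector representation are illustrative choices rather than FLOCKD's actual implementation:

```python
# Sketch of FedAvg-style model aggregation: each camera trains locally
# and shares only model weights, never raw video. The flat weight-vector
# representation and all names here are illustrative assumptions.

def federated_average(client_weights, client_sizes):
    """Weighted average of per-device model weights.

    client_weights: list of flat weight vectors (one per camera)
    client_sizes:   number of local training samples per camera
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    global_weights = [0.0] * n_params
    for weights, size in zip(client_weights, client_sizes):
        for i, w in enumerate(weights):
            # each camera contributes proportionally to its local data
            global_weights[i] += w * (size / total)
    return global_weights

# Example: two cameras with differently trained two-parameter models.
cameras = [[1.0, 0.0], [3.0, 2.0]]
samples = [100, 300]  # the second camera saw three times more data
print(federated_average(cameras, samples))  # -> [2.5, 1.5]
```

When per-camera specialisation makes such a single averaged model unviable, as the abstract anticipates, the same weight-sharing mechanism can instead ship a camera's specialised model to a peer for feedback rather than merging it into a global one.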
UPSIM – Unleash Potentials in SIMulations
Funding: ITEA 3 Call 6
Partners: 3D Mapping Solutions GmbH, Aarhus University, Agro Intelligence ApS, Atlas Copco Industrial Technique AB, Audi, Automotive Solution Center for Simulation e.V., BEIA Consult International, Calejo Industrial Intelligence AB, Deutsches Zentrum für Luft- und Raumfahrt (DLR), Eindhoven University of Technology, Equa Simulation AB, iCONDU GmbH, In Summa Innovation b.v., Infineon Technologies AG, KE-works BV, Keyland Sistemas de Gestión SL, LifeTec Group BV, Linköping University, LTX Simulation GmbH, Lucian Blaga University of Sibiu, NETCheck S.A., NLR – Royal Netherlands Aerospace Centre, Philips Electronics Nederland BV, Philips Consumer Lifestyle B.V., Reden BV, Robert Bosch GmbH, Saab AB, Scania, SII CONCATEL S.L., Sioux LIME BV, Softwarehelden GmbH & Co. KG, Swedish National Road and Transport Research Institute, Technische Universität Berlin, The Manufacturing Research Centre (MTC), Unit040 Ontwerp B.V., University of Augsburg, University of Groningen, Virtual Vehicle Research GmbH (coordinator), Volkswagen A.G., Volvo Personvagnar AB
Abstract: Nowadays, simulation is used for design-space exploration, virtual testing and predictive maintenance to support early-stage product decisions, while real testing is ultimately used to assure product quality and certification. The aim of UPSIM is to enable companies to safely collaborate on simulations in a repeatable, reliable and robust manner, and to implement simulations in a Credible Digital Twin setting as a strategic capability, so that simulation becomes an important factor in quality, cost, time-to-market and overall competitiveness.
MARVEL – Multimodal Extreme Scale Data Analytics for Smart Cities Environments
Funding: EU H2020-ICT-2018-20 (Information and Communication Technologies)
Partners: Idryma Technologias Kai Erevnas, Infineon Technologies AG, Atos Spain SA, Consiglio Nazionale Delle Ricerche, Intrasoft International SA, Fondazione Bruno Kessler, Audeering GmbH, Tampereen Korkeakoulusaatio SR, Primanova Sas, Sphynx Technology Solutions AG, Comune Di Trento, Univerzitet U Novom Sadu Fakultet, Information Technology For Market, Greenroads Limited, Zelus Ike, Instytut Chemii Bioorganicznej Polskiej
Abstract: MARVEL aspires to converge a set of technologies in the areas of AI, analytics, multimodal perception, software engineering and HPC as part of an Edge-Fog-Cloud Computing Continuum paradigm that goes beyond traditional Big Data and conventional architectures, heavily capitalising on distributed resources and heterogeneous data sources in smart city environments, while implementing privacy-preservation techniques across all data modalities and at all levels of its architecture. The ultimate aim is to support data-driven real-time application workflows and decision-making in modern cities, showcasing the potential to address societal challenges very effectively, from increasing public safety and security to analysing traffic flows and traffic behaviour in the cities of Trento and Malta.
COGITO – COnstruction phase diGItal Twin mOdel
Funding: EU H2020-NMBP-ST-IND-2018-2020 – Industrial Sustainability
Partners: Aarhus University, Hypertech (Coordinator), University College London, University of Edinburgh, Ethniko Kentro Erevnas Kai Technologikis Anaptyxis, Universidad Politécnica de Madrid, BOC Asset Management GmbH, Que Technologies IKE, Novitech a.s, ASM – Centrum Badan i Analiz Rynku, Ferrovial Agroman S.A, Olympia Odos Concession Company S.A. For The Motorway Elefsina – Korinthos – Patra – Pyrgos – Tsakona, Rhomberg Sersa Rail Group
Abstract: COGITO targets a semantic and pragmatic alignment between novel data capture techniques and the delivery of value-adding end-user services, leveraging the power of near-real-time data for the timely detection of health & safety hazards to humans and of construction quality defects, as well as constantly up-to-date workflow management, in order to minimise construction project time/cost overruns and alleviate workplace accidents.
BIM2Twin – Optimal Construction Management & Production Control
Funding: EU H2020-NMBP-ST-IND-2018-2020 – Industrial Sustainability
Partners: Aarhus University, Centre Scientifique Et Technique Du Batiment (Coordinator), Technion – Israel Institute Of Technology, The Chancellor Masters And Scholars Of The University Of Cambridge, Technische Universitaet Muenchen, Institut National De Recherche En Informatique Et Automatique, Fira Group Oy, Intsite Ltd, Fundacion Tecnalia Research & Innovation, Acciona Construccion SA, Ruhr-Universitaet Bochum, Spada Construction, Universita Politecnica Delle Marche, Unismart Padova Enterprise Srl, Orange SA, Siemens AG, Idp Ingenieria Y Arquitectura Iberia Sl
Abstract: The use of advanced technology is essential for improving the construction industry by allowing for more efficient management, increased productivity, and reduction of operational waste and carbon footprint. The EU-funded BIM2TWIN project will create a Digital Building Twin (DBT) platform for construction site management using artificial intelligence (AI) and semantic linked data techniques. The platform will provide full situational insight into the as-built product and as-performed processes, which will be compared with the as-designed product and as-planned processes through an extensible set of construction management applications implementing a closed-loop Plan-Do-Check-Act process. The full process will rely on multiple on-site sensors for data acquisition and cross-domain analysis, and on complex AI-based event processing. The DBT will offer an application programming interface allowing construction management applications to interoperate with its data/information/knowledge bases.
AgroRobottiFleet
Funding: Innovation Fund Denmark
Partners: Aarhus University, AgroIntelli, Technical University of Denmark, Danish Technological Institute, Business Region Midtvest
Abstract: AgroRobottiFleet aims to improve the efficiency of robots performing agricultural tasks. Operating in fleets, they are tasked to collaborate and interact. This requires, on the one hand, reliable communication, but also safety guarantees and mechanisms to ensure they are obeyed. Peter Gorm Larsen and Lukas Esterle from Aarhus University, Department of Engineering, lead the efforts towards adjustable autonomy in agricultural robots.
HUBCAP – Digital Innovation HUBs and Collaborative Platform for Cyber-Physical Systems
Funding: EU H2020 – DT-ICT-01-2019 – Smart Anything Everywhere
Partners: Aarhus University (Coordinator), Newcastle University, Fortiss GmbH, Virtual Vehicle Research Center, Fondazione Bruno Kessler, KTH Royal Institute of Technology, University “Lucian Blaga” of Sibiu, Engineering Ingegneria Informatica S.p.A., Research Institutes of Sweden AB, F6S Network Limited, Politecnico di Milano, Unparallel Innovation, Controllab Products, BEIA Consult, Verified Systems International, Validas, Technology Transfer Systems srl
Abstract: HUBCAP will provide a one-stop-shop for European SMEs wanting to join the Cyber-Physical System (CPS) revolution. It builds on seven established Digital Innovation Hubs (DIHs) in seven European countries, each deeply embedded in its regional digital innovation ecosystem, and offering specialist expertise, experimental capabilities, and focused application domain knowledge in CPS Engineering.
From this base, HUBCAP will create a growing and sustainable pan-European network that will offer SMEs opportunities to undertake experiments, seek investment, and access expertise and training. This will be enabled by a cloud-based open collaboration platform with a ‘sandbox’ capability to help potential users trial new CPS design technology before investment.
SOLOMON – Self-Organisation and Learning Online in Mobile Observation Networks
Abstract: Smart cameras are embedded devices combining a visual sensor, a processing unit and a communication interface, allowing images to be processed on the device so that only aggregated information, instead of raw video data, is transmitted. Smart camera networks are typically used for large-scale, high-value security applications such as person tracking in airports or amusement parks. However, current smart cameras are expensive and have only very limited mobility, acting as a barrier to their wider adoption.
The SOLOMON project is driven by the rising demand for rapid-deployment camera networks which can adapt to provide security in the context of unforeseen situations and unfolding scenarios. This is evidenced by the rapid growth of leading body-cam company Edesix Ltd, whose VideoBadge technology is being adopted by police forces worldwide. However, recent research advances in smart camera networks have not yet been realised in dynamic body-worn camera networks, and deployed systems still rely on prohibitively expensive static hardware.
In the SOLOMON project we will develop a novel type of lightweight, inexpensive smart camera network suitable for rapid deployment and reconfiguration, where low-cost camera devices, such as Edesix’s VideoBadge, are paired with the processing capabilities of smartphones. These are then worn by people (e.g. police, security guards) or mounted on mobile robots. This not only lowers cost, but allows us to introduce a feedback loop between the sensing cameras and the acting people/robots, enabling the camera network to adapt to changes during runtime, for example to prioritise or cover newly relevant areas in response to an unfolding situation. To achieve this vision, novel techniques in collective decision-making and self-organisation, as well as multi-objective online learning, will be developed.
CPS/IoT Ecosystem: Preparing Austria for the Next Digital Revolution
Abstract: Cyber-physical systems (CPS) are spatially distributed, time-sensitive, multiscale networked embedded systems, connecting the physical world to the cyber world through sensors and actuators. The Internet of Things (IoT) is the backbone of CPS: it connects the sensors and actuators to nearby gateways, and the gateways to the Fog and the Cloud. The Fog resembles the human spine, providing fast and adequate responses to imminent situations. The Cloud resembles the human brain, providing large storage and analytic capabilities.
In this project we will make Austria a major player in Real-Time (RT) CPS/IoT by building on its national strengths. In collaboration with renowned Austrian companies such as TTTech and ams AG, we will create an RT CPS/IoT ecosystem with more than 5000 sensors and actuators, where we can all experiment with new ideas, thereby developing Austrian know-how. This effort will be aligned with the strategic Austrian initiatives Industry 4.0 and Silicon Austria. The ecosystem will be distributed across Vienna in collaboration with our partners at AIT and IST.
EPiCS – Engineering Proprioception in Computing Systems
Funding: EU FP7-ICT
Partners: University of Paderborn (Germany), Alpen-Adria-Universität Klagenfurt (Austria), Austrian Institute of Technology (AIT) (Austria), Eidgenössische Technische Hochschule Zürich (ETH) (Switzerland), Airbus Defence and Space (Germany), University of Oslo (Norway), Imperial College London (UK), University of Birmingham (UK)
Abstract: The EPiCS project aims at laying the foundation for engineering the novel class of proprioceptive computing systems. Proprioceptive computing systems collect and maintain information about their state and progress, which enables self-awareness through reasoning about their behaviour, and self-expression through effectively and autonomously adapting their behaviour to changing conditions. Concepts of self-awareness and self-expression are new to the domains of computing and networking; the successful transfer and development of these concepts will help create future heterogeneous and distributed systems capable of efficiently responding to a multitude of requirements with respect to functionality and flexibility, performance, resource usage and costs, reliability and safety, and security. Innovations from EPiCS are based on the systematic integration of research in concepts and foundations for self-aware and self-expressive systems with novel hardware/software platform technologies and architectures for engineering autonomic compute nodes and networks. EPiCS drives and validates the research through the requirements of three challenging application domains that cover both high-end computers and embedded systems, as well as embeddings into technical and non-technical contexts.