In-car phone use detection now ready for deployment

The number of car incidents is rising again after years of steady decline. One explanation is drivers' use of phones behind the wheel. The Dutch police are now introducing automated ticketing for drivers holding a phone, based on camera footage. The pilot for this technique was developed in 2018 by my MSc student Jannes Elings. The Dutch police have continued to develop the technique, which is now ready for deployment on public roads.

Popular Dutch media outlets NOS (Acht uur journaal), RTL (RTL Nieuws) and nu.nl have reported on this breakthrough.


In-car phone use – Photo courtesy of the Dutch police

Georgios Kapidis Multi-Task Learning paper at ICCV-EPIC workshop

Georgios Kapidis will present his recent work on multi-task learning for the recognition of actions in egocentric videos at the EPIC workshop at ICCV. By predicting multiple outputs such as gaze targets and hand locations, the shared network representation improves, which benefits the main task of action recognition. We demonstrate state-of-the-art results on the EGTEA Gaze+ dataset, and show clear improvements over action recognition alone on EPIC Kitchens.
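The multi-task idea can be sketched in a few lines: a shared backbone feeds several task-specific heads, and the losses of the auxiliary tasks (here gaze) flow back into the shared weights alongside the main action loss. This is a minimal, illustrative NumPy example; the layer sizes, loss weights, and single shared layer are my own simplifications, not the architecture from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy input: a pooled frame feature for one video clip.
x = rng.standard_normal(128)

# Shared backbone: one linear layer whose weights serve all tasks.
W_shared = rng.standard_normal((64, 128)) * 0.1
h = np.maximum(W_shared @ x, 0.0)  # ReLU hidden representation

# Task-specific heads on top of the shared representation.
W_action = rng.standard_normal((10, 64)) * 0.1  # action class logits
W_gaze = rng.standard_normal((2, 64)) * 0.1     # gaze target (x, y)

action_logits = W_action @ h
gaze_pred = W_gaze @ h

# Dummy targets, for illustration only.
action_target = 3
gaze_target = np.array([0.4, 0.6])

# Multi-task loss: cross-entropy for actions + MSE for gaze.
probs = np.exp(action_logits - action_logits.max())
probs /= probs.sum()
loss_action = -np.log(probs[action_target])
loss_gaze = np.mean((gaze_pred - gaze_target) ** 2)

# Weighted sum: gradients of both terms reach W_shared, so the
# auxiliary gaze task shapes the representation used for actions.
loss = loss_action + 0.5 * loss_gaze
print(float(loss) > 0.0)
```

Because both loss terms depend on the same shared weights, training on the auxiliary outputs regularizes the backbone, which is the intuition behind the improvements reported on EGTEA Gaze+ and EPIC Kitchens.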


Alex Stergiou has three papers accepted!

Alex Stergiou had a good run: three papers were accepted recently at ICIP (paper), ICMLA (paper) and the ICCV workshop on Interpreting and Explaining Visual Artificial Intelligence Models (paper). The papers deal with visualizing what 3D CNNs learn for video recognition. Congrats on the acceptances!

Moreover, the survey on vision-based analysis of human-human interactions has appeared in Computer Vision and Image Understanding (CVIU). It can be downloaded for free.

Two seed money projects awarded

Two project proposals by my colleagues and me have been selected for the Research IT innovation fund 2019 of Utrecht University. VIABLE (PI: Ronald Poppe, co-applicants: Chantal Kemner, Maja Dekovic, Albert Ali Salah, Chris Dijkerman, Jorg Huijding) will address the development of tools for the automated analysis of parent-child interactions.

DEEP (PI: Marco Helbich, co-applicants: Daniel Oberski, Ronald Poppe) addresses the automated detection of green zones from street view data using CNNs.

It will be great to work on these novel topics that present many opportunities for further research and development.