Hi, my name is Sander Staal, and welcome to my personal website. Since the beginning of 2020, I have been working as a software engineer at Google on the YouTube data team. Before joining Google, I worked as a research assistant at the Perceptual User Interfaces Group at the University of Stuttgart (Germany) and at the Distributed Systems Group at ETH Zürich (Switzerland).
During that time, I worked on perceptually optimized user interface notifications and on quantifying visual attention during everyday mobile phone interactions. I obtained my Master's degree (2019) and Bachelor's degree (2017) in computer science at ETH Zürich.
No matter how powerful individual systems and CPUs become, there will always be a need to organize and manage large-scale distributed systems. This raises a wide range of interesting questions, such as how to optimize distributed computations, how to design communication models between such systems, and how to develop new algorithms to better organize and process large-scale datasets.
As smart devices become more ubiquitous, eye gaze can enhance the way we interact with the objects around us. Many applications can benefit from knowing where the user is currently looking. Combined with related computer vision techniques, such as scene understanding and object detection, gaze estimation lets us build new systems and applications that assist humans in their daily lives.
With the ever-increasing number of devices we use in our daily lives, there is great potential to improve the way we interact with them. New technologies in wireless communication, wearable computing, and location awareness can make these devices far more aware of their surroundings and context.
With the major advances in machine learning over the last 15 years, such as reinforcement learning and deep neural networks, we can combine these techniques with computer vision and image processing to create new and innovative systems.