hi! I'm
Carolina Brum

User Experience
Natural Movement Processing

I use human-driven data, typically from wearable or embedded sensors, together with human movement modelling, to design algorithms for endpoint prediction, gesture recognition, activity recognition, interaction prediction, and HCI modelling. To do so, I draw on signal processing, machine learning, and sensor fusion techniques.
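As a flavour of the kind of sensor fusion involved, here is a minimal, hypothetical sketch (not any specific production algorithm): a complementary filter that blends a gyroscope's low-noise but drifting rate signal with an accelerometer's noisy but unbiased gravity-based tilt estimate, a common baseline for wearable orientation tracking. The function name and the `alpha` weighting are illustrative assumptions.

```python
import math

def complementary_filter(acc_samples, gyro_rates, dt, alpha=0.98):
    """Fuse accelerometer tilt with integrated gyroscope rate to track
    one tilt angle (radians). `alpha` weights the smooth gyro path
    against the noisy but drift-free accelerometer path."""
    # Initial tilt from the gravity direction of the first accelerometer sample.
    angle = math.atan2(acc_samples[0][1], acc_samples[0][2])
    estimates = [angle]
    for (ax, ay, az), omega in zip(acc_samples[1:], gyro_rates[1:]):
        acc_angle = math.atan2(ay, az)  # tilt implied by measured gravity
        # Integrate the gyro rate, then gently pull toward the accelerometer tilt.
        angle = alpha * (angle + omega * dt) + (1 - alpha) * acc_angle
        estimates.append(angle)
    return estimates
```

A single tunable constant makes this far cheaper than a full Kalman filter, at the cost of a fixed trade-off between gyro drift and accelerometer noise.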


In recent years, new sensing technologies have emerged for assessing human motion, interaction and behaviour. Human-driven signals are challenging: they are unpredictable, complex and diverse, and therefore hard to model.

How to validate a sensing technology and determine its accuracy and value? Each sensor has its limitations and artefacts, and characterizing them is key to mitigating their effects. How to set up cross-functional efforts, including hardware and design teams, that highlight strengths and mitigate limitations?


Prediction filters, navigation, localization, detection, tracking, classification and mapping algorithms tailored for human movement and interaction are the core of my work.
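To illustrate the kind of prediction filter this covers — a hedged sketch with made-up parameters, not the algorithms described here — the following constant-velocity Kalman filter smooths a stream of 1-D pointer positions, the textbook starting point for tracking and endpoint prediction:

```python
import numpy as np

def kalman_track(zs, dt=0.01, q=1.0, r=0.01):
    """Constant-velocity Kalman filter over 1-D position measurements `zs`.
    Returns the filtered position estimates. `q` scales process noise,
    `r` is the measurement noise variance (both illustrative defaults)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition: (position, velocity)
    H = np.array([[1.0, 0.0]])                # only position is measured
    Q = q * np.array([[dt**4 / 4, dt**3 / 2],
                      [dt**3 / 2, dt**2]])    # white-acceleration process noise
    R = np.array([[r]])
    x = np.array([[zs[0]], [0.0]])            # start at first measurement, zero velocity
    P = np.eye(2)
    out = []
    for z in zs:
        # Predict forward one time step.
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new measurement.
        y = np.array([[z]]) - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

Because the filter also carries a velocity estimate, extrapolating the state a few steps ahead with `F` is one simple route from tracking to endpoint prediction.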


Human-robot interaction, self-driving cars and gesture-based controllers require technology capable of capturing human intent and handling the unpredictable nature of human behaviour. How to make sensing less reactive and build proactive smart systems?

user experience

How to reconcile human comfort with measurement accuracy? Users are increasingly critical and selective about what they use and wear, and about which devices they are willing to have at home and at work. I pair with product designers and UX researchers to find out where, when, how and what to measure.


I run experiments to onboard users to new experiences, evaluating their effort and satisfaction while using new controllers. I also design and execute experiments to gather data for machine learning. I deliver end-to-end human experiment solutions, from experiment protocol design to statistical analysis and data visualization.

natural movement processing

What vocabulary of gestures or interactions works? How learnable is an interaction? What is the feasible region that accounts for interaction accuracy, computational cost, power, user effort (mental and physical) and satisfaction? Which dynamic model best describes a given human motion or interaction, and how robust is it?

Reach me at carolina AT carolinabrum.com


Dr. Carolina Brum is a Senior Research Scientist in the Machine Intelligence Sensing group at Apple, the team behind the handwashing and assistive touch features. She earned her PhD in Music Technology and Integrated Sensor Systems at McGill University, in collaboration with the Responsive Environments Group at the MIT Media Lab, where she developed novel sensor fusion and filtering algorithms for wearable sensors. Her work has been published in top journals, such as IEEE Sensors and Open Sensors. Carolina was instrumental in developing robust real-time tracking and prediction algorithms for the Soli radar sensor, which has been deployed on the Google Pixel 4 and the Nest Thermostat. At Chatham Labs, now Facebook Reality Labs, Carolina developed a learning regression algorithm for six-degree-of-freedom endpoint prediction for ray pointing in VR. Her research interests include sensing, algorithms, gesture analysis and human interaction. [selected publications]
Website content is © Carolina Brum Medeiros 2021. All rights reserved. Please request permission to copy or redistribute via e-mail. Powered by Barebones