Behavior over claims
The benchmark focuses on how aiming actually unfolds over time. I care less about what a device claims to be, and more about whether the movement profile looks human, controller-native, or translated from another input source.
This thesis project looks at how people aim in FPS-style tasks, and whether those movement patterns can help separate real controller play from mouse input that has been translated to look like a controller. In simple terms, I am trying to study how the aim actually behaves, not just which device a session claims to use.
The benchmark is designed as a short, repeatable task with fixed durations and consistent goals. That makes it easier to compare players, runs, and device modes in a structured way while still preserving realistic moment-to-moment behavior.
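To make "fixed durations and consistent goals" concrete, a mode definition might look something like the sketch below. All of the names (`BenchmarkMode`, `durationSeconds`, and so on) are hypothetical and only illustrate the idea, not the platform's actual configuration.

```typescript
// Hypothetical sketch of a fixed benchmark mode definition.
// None of these names come from the actual platform; they only
// illustrate "fixed durations and consistent goals".
interface BenchmarkMode {
  id: string;                        // e.g. "tracking-60s"
  durationSeconds: number;           // run length is fixed, not open-ended
  targetCount: number;               // how many targets are active at once
  targetSpeed: number;               // units per second, identical across runs
  inputMode: "mouse" | "controller"; // declared device for the run
}

// Every run of the same mode uses the same parameters, so scores,
// input traces, and timings stay comparable across players and devices.
const trackingMode: BenchmarkMode = {
  id: "tracking-60s",
  durationSeconds: 60,
  targetCount: 1,
  targetSpeed: 3.5,
  inputMode: "controller",
};
```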
Some adapters can make mouse input resemble controller input, which can create aim-assist advantages in games built around fair controller play. The goal here is to understand those patterns better without relying only on whatever device label gets reported.
Sessions record score progression, hits, misses, timing, input traces, and key gameplay events in small batches. That means even abandoned or partial runs still contribute useful evidence for later analysis.
Performance: score, shots fired, shots hit, accuracy, and benchmark mode length.
Input: mouse movement, clicks, keys, controller sticks, triggers, and button activity, depending on the run type.
Gameplay events: pause/fullscreen transitions, movement-zone events, target interactions, timestamps, session identifiers, and related metadata.
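One way to picture those three streams is as small typed records, roughly like the sketch below. Every field name here is an illustrative assumption, not the platform's real telemetry format.

```typescript
// Hypothetical shapes for the three recorded streams; field names are
// illustrative assumptions, not the platform's real telemetry schema.

interface PerformanceSample {
  t: number;             // ms since run start
  score: number;
  shotsFired: number;
  shotsHit: number;
  accuracy: number;      // shotsHit / shotsFired at this point in time
}

interface InputSample {
  t: number;
  source: "mouse" | "keyboard" | "controller";
  // Mouse runs: relative deltas; controller runs: stick positions in [-1, 1].
  dx?: number;
  dy?: number;
  stickX?: number;
  stickY?: number;
  buttons?: string[];    // pressed buttons/keys/triggers at this sample
}

interface GameplayEvent {
  t: number;
  sessionId: string;
  kind: "pause" | "fullscreen" | "movement-zone" | "target-hit" | "target-miss";
  detail?: Record<string, unknown>;
}
```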
Data is uploaded continuously while you play. If you stop early, everything that was already uploaded is kept and the session is marked as abandoned instead of being deleted. That makes interruptions and incomplete runs part of the dataset instead of hiding them.
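A minimal sketch of that batching behavior, assuming a generic HTTP endpoint (the `/telemetry` and `/sessions/...` paths are made up): samples go out in small chunks while play continues, and a run that stops early only gets relabeled, never rolled back.

```typescript
// Minimal sketch of continuous batched upload, assuming a generic
// HTTP endpoint. "/telemetry", "/sessions", and the payload shape are made up.
const BATCH_SIZE = 50;
let pending: unknown[] = [];

async function record(sample: unknown): Promise<void> {
  pending.push(sample);
  if (pending.length >= BATCH_SIZE) {
    await flush();
  }
}

async function flush(): Promise<void> {
  if (pending.length === 0) return;
  const batch = pending;
  pending = [];
  // Each batch stands on its own: once uploaded, it is kept even if
  // the player never finishes the run.
  await fetch("/telemetry", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(batch),
  });
}

// If the run ends early, the session is only relabeled, never deleted.
async function markAbandoned(sessionId: string): Promise<void> {
  await flush();
  await fetch(`/sessions/${sessionId}/status`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ status: "abandoned" }),
  });
}
```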
A grounded version of the problem this benchmark is trying to study.
A player might physically use a mouse, keyboard, or controller. That raw movement has its own texture: mouse movement tends to be sharp and high-frequency, while controller movement tends to be smoother and bounded by stick mechanics.
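As a rough illustration of that difference in texture, two crude summary features over an aim trace already point in the right direction; the feature choice and the shape of the input are assumptions for this sketch, not the analysis used in the thesis.

```typescript
// Illustrative sketch only: two crude summary features over an aim
// trace of per-frame deltas. Real analysis would use richer
// time-series features; this only shows the intuition.

interface AimSample { dx: number; dy: number; } // per-frame aim delta

function movementFeatures(trace: AimSample[]) {
  let totalMagnitude = 0;
  let directionFlips = 0;
  for (let i = 0; i < trace.length; i++) {
    totalMagnitude += Math.hypot(trace[i].dx, trace[i].dy);
    if (i > 0 && Math.sign(trace[i].dx) !== Math.sign(trace[i - 1].dx)) {
      directionFlips++;
    }
  }
  return {
    // Mouse traces tend to show larger, spikier per-frame deltas...
    meanDelta: totalMagnitude / Math.max(trace.length, 1),
    // ...and more frequent small direction reversals than stick motion,
    // which is bounded and smoothed by the stick's mechanics.
    flipRate: directionFlips / Math.max(trace.length - 1, 1),
  };
}
```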
Some devices sit between the real input source and the game, then convert mouse movement into controller-like stick signals. To the game, that can look like controller input even though the physical movement started as mouse input.
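In principle, that translation step can be as simple as scaling relative mouse deltas into the bounded range a stick reports, roughly as sketched below; the sensitivity constant and the lack of smoothing are assumptions, and real adapters apply their own curves and tuning.

```typescript
// Sketch of the translation step in principle: relative mouse deltas
// are scaled and clamped into the [-1, 1] range a stick reports.
// The sensitivity constant is arbitrary; real adapters use their own
// curves, smoothing, and per-game tuning.
const SENSITIVITY = 0.02;

function mouseDeltaToStick(dx: number, dy: number): { x: number; y: number } {
  const clamp = (v: number) => Math.max(-1, Math.min(1, v));
  return { x: clamp(dx * SENSITIVITY), y: clamp(dy * SENSITIVITY) };
}

// Fast flicks saturate at the clamp and small corrections stay spiky,
// which is part of the gap between reported device and observed behavior.
```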
Even when the reported device says controller, the resulting aim path may still carry mouse-like traits. This benchmark is meant to collect enough time-series detail to study that gap between reported device and observed behavior.
You play a short aiming benchmark. The platform records how your input and performance change over time, then I use those patterns to study fair-play questions around controller behavior, translated input, and reproducible gameplay telemetry. The goal is to make it easier to reason about suspicious input behavior without reducing everything to a single score or a single device label.