Tracking and analyzing Hexbug locomotion using DeepLabCut to understand collective motion patterns and emergent behaviors in mechanical active matter systems.
Understanding the dynamics of biological and mechanical systems often requires tracking and analyzing their motion. This summer research project explores the movement of Hexbugs, small autonomous robots that mimic biological motion, within a controlled circular arena. By applying DeepLabCut (DLC), an open-source deep-learning toolbox for markerless pose estimation, we tracked their paths and analyzed the underlying behavioral patterns.
The research demonstrates how models trained on a few annotated frames can generalize to predict trajectories across entire video datasets, enabling in-depth motion analysis with minimal manual labor. This approach leverages DeepLabCut's deep learning framework for markerless pose estimation and automated trajectory tracking of multiple Hexbugs simultaneously, providing a scalable solution for motion analysis.
Our analysis employs two key metrics: Mean Squared Displacement (MSD) to quantify motion patterns, and the Order Parameter (Φ) to measure collective behaviors. Through this quantitative analysis, we discovered unexpected intermittent synchronization and clustering behaviors resembling natural swarm dynamics observed in biological systems such as bacterial colonies and flocking birds.
This work provides insights into how machine learning can efficiently track and analyze active matter systems, with implications for understanding collective motion in both mechanical and biological contexts.
A systematic approach combining experimental design with machine learning analysis
The Hexbugs were placed in a circular arena with an enclosing ring to simulate biological motion patterns. This controlled environment allowed us to observe and record their interactions and movement behaviors systematically. The circular boundary created constraints that influenced the collective dynamics of the system.
An overhead camera was mounted to capture high-resolution movement patterns for comprehensive analysis. The recording setup ensured consistent lighting and frame rate throughout the experiments, which was critical for accurate tracking and analysis of the Hexbug trajectories.
Manual labeling of key points on a small subset of frames was performed to train the neural network. This initial annotation phase required careful identification of tracking points on each Hexbug, establishing the foundation for the machine learning model to learn spatial relationships.
DeepLabCut's convolutional neural network was trained to learn spatial relationships and key point recognition. The model utilized transfer learning from pre-trained networks, allowing it to generalize from the limited labeled data to predict positions across entire video sequences with high accuracy.
Once trained, the model generated predictions across entire video datasets automatically. This eliminated the need for manual frame-by-frame tracking, dramatically reducing analysis time while maintaining consistency and precision in the measurements.
We calculated Mean Squared Displacement (MSD) and Order Parameter to quantify motion patterns and collective behavior. These metrics provided quantitative insights into the diffusive versus directed motion characteristics and the degree of alignment among the Hexbugs over time.
Video recordings and data visualizations from our experiments with different Hexbug configurations
Raw video footage showing 5 Hexbugs in the arena before DeepLabCut tracking
Tracked video output with predicted positions and trajectories overlaid by DeepLabCut
Raw video footage showing 7 Hexbugs in the arena with increased agent density
Tracked video with predicted positions demonstrating multi-agent tracking capabilities
Insights from trajectory analysis and behavioral pattern recognition
The trained model successfully tracked multiple Hexbugs simultaneously, generating clean and continuous trajectories that reveal complex motion patterns within the circular arena. The trajectory plots demonstrate the model's ability to maintain consistent tracking even when Hexbugs moved in close proximity to one another or temporarily overlapped in the video frame.
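DeepLabCut exports, for each tracked point, x, y, and a likelihood column. A common post-processing step for maintaining continuous trajectories through occlusions (a plausible cleanup step, not necessarily the exact one used in this project) is to mask low-likelihood detections and interpolate across the resulting gaps. A minimal NumPy sketch:

```python
import numpy as np

def clean_track(x, y, likelihood, threshold=0.9):
    """Mask low-confidence detections and fill gaps by linear interpolation."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    good = np.asarray(likelihood) >= threshold
    frames = np.arange(len(x))
    # Interpolate each coordinate using only the reliable frames.
    x_clean = np.interp(frames, frames[good], x[good])
    y_clean = np.interp(frames, frames[good], y[good])
    return x_clean, y_clean

# Tiny synthetic example: frame 2 is an unreliable detection (e.g. an overlap).
x = [0.0, 1.0, 99.0, 3.0, 4.0]
y = [0.0, 1.0, -50.0, 3.0, 4.0]
p = [0.99, 0.98, 0.10, 0.97, 0.99]
xc, yc = clean_track(x, y, p)
print(xc)  # outlier at frame 2 replaced by the interpolated value 2.0
```

The likelihood threshold (0.9 here) is a tunable assumption; stricter values trade more interpolation for fewer spurious jumps.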
Mean Squared Displacement analysis revealed that Hexbugs exhibited a mix of diffusive and directed motion patterns, with behavior varying based on boundary conditions. The MSD calculations showed transitions between ballistic motion when Hexbugs moved freely and confined motion when they encountered the arena walls or other obstacles.
Order parameter analysis indicated varying degrees of collective alignment, influenced by initial conditions and the number of Hexbugs in the arena. The data revealed periods of synchronized movement alternating with more chaotic phases, suggesting emergent coordination despite the lack of explicit communication between the mechanical agents.
Detailed implementation of data analysis and visualization
All Python code used for trajectory analysis, MSD calculations, order parameter computation, and visualization generation is available in a separate repository. The code includes data processing scripts, metric computation functions, and plotting utilities used to generate the figures shown in this research.
Visit the HexBugTracking repository to see the complete Python implementation, including README documentation with code examples and usage instructions.
Mathematical frameworks for quantifying motion and collective behavior
Quantifies how far Hexbugs move from their initial positions over time: MSD(τ) = ⟨|r(t + τ) − r(t)|²⟩, where r(t) represents position at time t and τ is the lag time.
• MSD ∝ τ² → Ballistic (directed) motion
• MSD ∝ τ → Diffusive behavior
• Sub-linear MSD → Confined motion
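The MSD can be computed directly from the tracked (x, y) coordinates by averaging over all start times at each lag. A minimal NumPy sketch (the HexBugTracking repository contains the actual implementation used for the figures):

```python
import numpy as np

def mean_squared_displacement(positions):
    """MSD(tau) = <|r(t + tau) - r(t)|^2>, averaged over all start times t.

    positions: array of shape (T, 2) holding x, y per frame.
    Returns an array of length T - 1, for lags tau = 1 .. T - 1 frames.
    """
    positions = np.asarray(positions, dtype=float)
    T = len(positions)
    msd = np.empty(T - 1)
    for tau in range(1, T):
        disp = positions[tau:] - positions[:-tau]
        msd[tau - 1] = np.mean(np.sum(disp**2, axis=1))
    return msd

# Sanity check: constant velocity (1, 0) per frame gives MSD = tau^2 (ballistic).
ballistic = np.column_stack([np.arange(10.0), np.zeros(10)])
msd_ballistic = mean_squared_displacement(ballistic)
print(msd_ballistic)  # [1, 4, 9, ...] = tau^2
```

Fitting the exponent of MSD versus τ on a log-log scale distinguishes the ballistic, diffusive, and confined regimes listed above.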
Measures collective directionality and alignment: Φ = |(1/N) Σᵢ vᵢ/|vᵢ||, where vᵢ is the velocity vector of Hexbug i and N is the total number of Hexbugs.
• Φ ≈ 1 → Strong alignment (collective motion)
• Φ ≈ 0 → Random/chaotic motion
• Intermediate values → Partial coordination
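A minimal NumPy sketch of this order parameter, using per-frame velocity vectors (in practice estimated from consecutive tracked positions; the repository holds the actual implementation):

```python
import numpy as np

def order_parameter(velocities):
    """Phi = |(1/N) sum_i v_i / |v_i||: 1 = fully aligned, ~0 = random motion."""
    v = np.asarray(velocities, dtype=float)
    speeds = np.linalg.norm(v, axis=1, keepdims=True)
    unit = v / speeds                      # normalize away individual speeds
    return np.linalg.norm(unit.mean(axis=0))

aligned = [[1.0, 0.0], [2.0, 0.0], [0.5, 0.0]]  # all moving +x, varying speed
opposed = [[1.0, 0.0], [-1.0, 0.0]]             # head-on, directions cancel
print(order_parameter(aligned))  # 1.0
print(order_parameter(opposed))  # 0.0
```

Because each velocity is normalized before averaging, Φ reflects only the agreement of headings, not the speeds, which is why it cleanly separates synchronized phases from chaotic ones.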
Collaborative effort between students and faculty at Augsburg University
Research Collaborator
Research Mentor & Advisor
Augsburg University