Features
A modular collection of R packages covering every stage of movement analysis, for ethologists, behavioural ecologists, neuroscientists, and anyone else working with movement data.
Read from Anywhere
Support for video-based tracking tools such as DeepLabCut, SLEAP, idtracker.ai, and many others, as well as hardware-based data from trackballs.
Flexible Toolkit
Designed as a tidyverse‑friendly suite of modular R packages that together form a unified workflow. Start with the single component you need, and then add others as new challenges arise.
Thoroughly Tested
All packages are tested with testthat, with coverage tracked through Codecov, so you can trust your findings.
Get started with animovement
Install
Install the development version of animovement with:
install.packages('animovement',
  repos = c(
    'https://animovement.r-universe.dev',
    'https://cloud.r-project.org'
  )
)
Load
After installing, load the package in your R session with:
library(animovement)
Learn
Follow our tutorials to learn more about the powerful features of animovement.
Packages
animovement is an ecosystem of R packages for every stage of movement analysis.
Read about the individual packages below.
aniframe
Core data structures
Provides S3 classes and methods for representing and manipulating movement data with consistent, intuitive interfaces.
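As a minimal sketch, coercing a plain data frame into a movement data structure might look like this; the constructor name as_movement() and its arguments are assumptions for illustration, not the confirmed aniframe API:

# Hypothetical sketch -- constructor and argument names are assumptions
library(aniframe)

raw <- data.frame(
  time = seq(0, 9, by = 1),
  x = cumsum(rnorm(10)),
  y = cumsum(rnorm(10)),
  individual = "mouse_01"
)

# Declare which columns hold time, coordinates, and identity
tracks <- as_movement(raw, time = "time", coords = c("x", "y"), id = "individual")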
aniread
Reading and writing
Methods for reading and writing movement data from diverse sources, from video tracking systems to trackballs.
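Reading data in might look roughly like the sketch below; the reader names read_deeplabcut() and read_trackball() are assumptions based on the supported sources, not necessarily the exact aniread exports:

# Hypothetical sketch -- reader function names are assumptions
library(aniread)

# Pose estimates exported from DeepLabCut as CSV
dlc_tracks <- read_deeplabcut("pose_estimates.csv")

# Rotation data logged from a trackball rig
ball_tracks <- read_trackball("trackball_log.csv")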
anicheck
Diagnosing data quality
Diagnostic tools for assessing data quality, identifying missing values, temporal gaps, and spatial outliers.
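A quality check might look something like this, continuing from the aniframe sketch above; check_movement() is an assumed name for illustration:

# Hypothetical sketch -- function name is an assumption
library(anicheck)

# Summarise missing values, temporal gaps, and spatial outliers
report <- check_movement(tracks)
print(report)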
aniprocess
Signal processing and filtering
Methods for filtering, smoothing, and processing movement trajectories with advanced signal processing techniques.
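Continuing the same pipeline, a cleaning step might look like the sketch below; interpolate_gaps() and smooth_movement() and their arguments are assumptions, not confirmed aniprocess functions:

# Hypothetical sketch -- function and argument names are assumptions
library(aniprocess)

# Interpolate short gaps, then smooth with a rolling median
clean <- interpolate_gaps(tracks, max_gap = 5)
smoothed <- smooth_movement(clean, method = "rolling_median", window = 11)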
animetric
Calculating metrics
Calculate movement-based metrics including kinematics, navigational statistics, and social interaction measures.
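For example, deriving kinematics from the smoothed tracks above might look like this; calculate_kinematics() is an assumed name for illustration:

# Hypothetical sketch -- function name is an assumption
library(animetric)

# Derive speed, acceleration, and turning angles per individual
kinematics <- calculate_kinematics(smoothed)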
anivis
Visualization methods
Plotting methods for movement trajectories, diagnostic summaries, and calculated metrics with publication‑ready outputs.
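Plotting the results of the pipeline above might look like the sketch below; plot_trajectory() and its colour argument are assumptions, not the confirmed anivis API:

# Hypothetical sketch -- function and argument names are assumptions
library(anivis)

# Plot each individual's trajectory, coloured by speed
plot_trajectory(kinematics, colour = "speed")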