ROS Community Metrics

We measure aspects of the ROS community to understand and track the impact of our work and identify areas for improvement. We take inspiration from the MeeGo Project's metrics.

Related: a crowd-sourced map of ROS users around the world.

Ohloh Metrics: Ohloh provides metrics on open source code repositories. See ros stack metrics and follow the related links to see other stacks.

Website with Visualization hosts fine-grained visualizations of various metrics from the ROS hosting services.


We periodically publish a metrics report that provides a quantitative view of the ROS community. We expect to publish a report quarterly, though more automation of the data-gathering steps could make more frequent publication feasible.

We're collectively learning what to measure and how, so please provide feedback! Add your suggestions for improving these reports below, or post them to

  1. August 2011 report

  2. July 2012 report

  3. August 2013 report

  4. July 2014 report

  5. July 2015 report

  6. July 2016 report

  7. July 2017 report

  8. July 2018 report

  9. July 2019 report

  10. July 2020 report

  11. July 2021 report


What should we measure differently? Put your suggestions here.

  • Choice of IDE. Also, choice of IDE per programming language.
  • How are wiki edits spread across users?
  • How big are wiki edits?
  • How are answers spread across users?
  • How many commits were made? (requires a multi-VCS crawler)
  • How many tickets were opened/closed? (requires a multi-tracker crawler)
  • Which OS is used? (view count comparison for the installation subpages)
  • What kinds of packages are popular? (requires tags on packages)
  • How many participants take part in large epic projects (e.g. APC, DRC)?
  • How often are packages updated?
  • Number of unique users over time (monthly or quarterly)
  • Number of wiki tutorial pages under any package
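Several of the suggestions above (how edits are spread across users, unique users over time) reduce to simple aggregations over an edit log. A minimal sketch in Python, assuming a hypothetical `edit_log` of (user, date, bytes changed) tuples; the real wiki export format would differ:

```python
from collections import Counter, defaultdict
from datetime import date

# Hypothetical edit log: (username, edit date, bytes changed).
edit_log = [
    ("alice", date(2021, 7, 3), 120),
    ("bob",   date(2021, 7, 9), 40),
    ("alice", date(2021, 8, 1), 300),
    ("carol", date(2021, 8, 2), 15),
    ("alice", date(2021, 8, 20), 60),
]

def edits_per_user(log):
    """How are edits spread across users? Returns username -> edit count."""
    return Counter(user for user, _, _ in log)

def top_share(counts, fraction=0.1):
    """Share of all edits made by the top `fraction` of users (at least one)."""
    ranked = sorted(counts.values(), reverse=True)
    k = max(1, int(len(ranked) * fraction))
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0

def monthly_unique_users(log):
    """Number of unique editors per (year, month) bucket."""
    buckets = defaultdict(set)
    for user, d, _ in log:
        buckets[(d.year, d.month)].add(user)
    return {month: len(users) for month, users in sorted(buckets.items())}
```

With the sample log, `top_share` reports the most active editor's share of all edits, and `monthly_unique_users` gives the per-month editor counts; the same aggregations would apply to answer counts or commit logs once a crawler collects them.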

Wiki: Metrics (last edited 2022-02-02 18:55:31 by IsaacSaito)