ROS Community Metrics

We measure aspects of the ROS community to understand and track the impact of our work and identify areas for improvement. We take inspiration from the MeeGo Project's metrics.

Related: a crowd-sourced map of ROS users around the world.

Ohloh Metrics: Ohloh provides metrics on open-source code repositories. See the ros stack metrics and follow the related links for other stacks.

Reports

We periodically publish a metrics report that provides a quantitative view of the ROS community. We expect to publish a report quarterly, though more automation of the data-gathering steps could make it feasible to publish more frequently.

We're collectively learning what to measure and how. Please provide feedback! Add your suggestions on how to improve these reports below, or post them to http://discourse.ros.org/c/site-feedback.

  1. August 2011 report

  2. July 2012 report

  3. August 2013 report

  4. July 2014 report

  5. July 2015 report

  6. July 2016 report

Wishlist

What should we measure differently? Put your suggestions here.

  • How are wiki edits spread across users?
  • How big are wiki edits?
  • How are answers spread across users?
  • How many commits were made? (requires a multi-VCS crawler; see the sketch after this list)
  • How many tickets were opened/closed? (requires a multi-tracker crawler)
  • Which OS is used? (view count comparison for the installation subpages)
  • What kinds of packages are popular? (requires tags on packages)
  • How many participants are involved in large epic projects (e.g. APC, DRC)?
  • How often are packages updated?
  • Number of unique users over time (monthly or quarterly)
  • Number of wiki tutorial pages under any package
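As a starting point for the "multi-VCS crawler" item above, here is a minimal sketch that counts Git commits across a handful of repositories since a given date. The repository paths and the date window are illustrative assumptions, not part of any existing metrics tooling; a real crawler would also need to handle the other version control systems used in the community.

#!/usr/bin/env python
"""Sketch: count commits in a set of local Git checkouts since a date."""

import subprocess
from pathlib import Path

# Hypothetical local checkouts of ROS-related repositories.
REPOS = [Path("~/ros/ros_comm").expanduser(),
         Path("~/ros/geometry2").expanduser()]
SINCE = "2016-01-01"  # start of the reporting window (placeholder)


def count_commits(repo, since):
    """Return the number of commits in `repo` made on or after `since`."""
    out = subprocess.check_output(
        ["git", "-C", str(repo), "rev-list", "--count",
         "--since", since, "HEAD"])
    return int(out.decode().strip())


if __name__ == "__main__":
    total = 0
    for repo in REPOS:
        n = count_commits(repo, SINCE)
        total += n
        print("{}: {} commits since {}".format(repo.name, n, SINCE))
    print("Total: {} commits".format(total))

Counting by author (e.g. `git shortlog -sn --since ...`) would similarly answer the "how are commits spread across users" style of question, at the cost of reconciling author identities across repositories.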
