Trellis Scores Overview

Trellis Scores are a proprietary scoring mechanism from SEI that helps you understand the productivity of software development teams and individuals. Trellis Scores are calculated from factors such as code quality, code volume, speed, impact, proficiency, and collaboration. You can customize the weight assigned to each factor, allowing for tailored assessments based on individual profiles, and you can enable or disable individual metrics with toggle buttons.

Quality

Two metrics are used to assess the quality of work: the Percentage of Rework and the Percentage of Legacy Rework.

  • The Percentage of Rework measures the modifications made to recently written code, that is, code committed within the last 30 days. A lower percentage contributes to a higher Trellis Score. By default, any change to code written in the last 30 days is categorized as rework.

  • The Percentage of Legacy Rework measures the changes made to older code, that is, code committed more than 30 days ago. A lower percentage contributes to a higher Trellis Score.

What is Rework?

Rework is the process of changing or modifying existing code, whether it was recently written or belongs to the older codebase. This includes alterations, fixes, enhancements, and optimizations. Measuring rework gauges the degree of code stability, the frequency of necessary changes, and the efficiency of development efforts.
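
As a concrete sketch, classifying modified lines into rework and legacy rework might look like the following. This assumes each modified line carries the date its original version was authored (for example, from git blame); the function name and inputs are illustrative, not SEI's actual implementation:

```python
from datetime import timedelta

def rework_percentages(authored_dates, now, cutoff_days=30):
    """Split modified lines into rework (recent code) and legacy rework.

    authored_dates: the original authoring date of each line modified
    in the new commits (hypothetical input shape for illustration).
    """
    cutoff = now - timedelta(days=cutoff_days)
    total = len(authored_dates)
    # Lines whose original version is newer than the cutoff count as rework;
    # everything older counts as legacy rework.
    rework = sum(1 for d in authored_dates if d >= cutoff)
    legacy = total - rework
    return 100.0 * rework / total, 100.0 * legacy / total
```

With a 30-day cutoff, a commit that touches three recently written lines and one line authored months ago would report 75% rework and 25% legacy rework.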

Impact

Impact is defined by two metrics: High Impact bugs worked on per month and High Impact stories worked on per month. High Impact refers to the classification of bug or story tickets based on their perceived significance or priority; this classification can vary depending on user preferences.

  • High Impact bugs worked on per month: This measures the number of resolved, high-impact bug tickets that an engineer was assigned to in the last 30 days. When multiple developers collaborate on the same bug ticket, credit is distributed proportionately among them. It is computed by dividing the total number of high-impact bug tickets resolved by the engineer by the number of months.

  • High Impact stories worked on per month: This measures the number of resolved, high-impact story tickets that the engineer was assigned to in the last 30 days. A story is any task or issue that is not classified as a bug. If more than one engineer worked on the same ticket, engineers are credited proportionately.

High Impact depends on the Investment profile: the metrics that contribute to the Impact factor take categories from the Investment profile as input parameters. To configure this, see Configure the Impact Factor in a Trellis profile.
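
The proportional-credit rule described above can be sketched as follows. The ticket record shape and field names are assumptions for illustration, not SEI's data model:

```python
def high_impact_per_month(tickets, engineer, months):
    """Proportional credit: a resolved ticket shared by n assignees
    contributes 1/n to each assignee's monthly count."""
    credit = sum(1.0 / len(t["assignees"])
                 for t in tickets
                 if t["resolved"] and engineer in t["assignees"])
    return credit / months
```

For example, an engineer who solely resolved one high-impact ticket and shared another with one collaborator would be credited 1.5 tickets for that month.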

Volume

Volume measures the quantity of code that the developer is working on. The default volume score is calculated using the following six metrics:

  • Number of PRs per month: This measures the number of merged or closed Pull Requests (PRs) submitted by an engineer in the last 30 days.

  • Number of Commits per month: This is the number of commits an engineer has contributed in the last 30 days. This includes commits associated with Pull Requests as well as direct commits to the main branch.

  • Lines of Code per month: This metric measures the number of code lines (based on commits) contributed by an engineer in the last 30 days.

  • Number of bugs worked on per month: This is the number of resolved bug tickets assigned to an engineer within a user-defined time period. If more than one engineer worked on the same ticket, developers are credited proportionately.

  • Number of Stories worked on per month: This is the number of resolved story tickets assigned to an engineer within a user-defined time period. If more than one engineer worked on the same ticket, developers are credited proportionately.

  • Number of Story Points worked on per month: This is the number of resolved story points assigned to an engineer within a user-defined time period. If more than one engineer worked on the same ticket, engineers are credited proportionately.
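
A minimal sketch of the commit-based volume counts might look like the following; the commit record fields and the fixed 30-day month are assumptions for illustration:

```python
def volume_per_month(commits, engineer, days=30):
    """Tally commit-based volume metrics over a window of `days` days."""
    mine = [c for c in commits if c["author"] == engineer]
    months = days / 30.0
    return {
        "commits_per_month": len(mine) / months,
        "lines_of_code_per_month": sum(c["lines_changed"] for c in mine) / months,
    }
```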

Speed

Speed measures the pace at which engineers successfully resolve or close the tickets assigned to them. Speed is determined by the following three metrics:

  • Average Coding Days per Week: A coding day is any day on which a developer commits code. The recommended goal for coding days per week is 3.2 days. Calculation: The average coding days is computed by dividing the total number of coding days by the number of weeks under consideration. This metric quantifies how consistently developers actively contribute code to the codebase. Higher values indicate frequent code commits, contributing to faster development.

  • Average PR Cycle Time: This represents the time elapsed from PR creation to closing. The average PR cycle time should be less than 7 days. Calculation: The average PR cycle time is calculated by finding the time difference between the PR closed time and the PR created time for each closed PR.

  • Average Time Spent Working On Issues: This is the average time spent on each issue resolved within a user-defined time period. This typically doesn't include time spent in the Done status. Time is counted only when the developer is assigned to an issue. The average time spent working on issues should be between 3 and 5 days. Calculation: The average time spent working on issues is calculated by dividing the total time by the total number of issues recorded per period.
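
The first two calculations above can be sketched as follows; the input shapes (commit timestamps, PR created/closed pairs) are assumptions for illustration:

```python
def avg_coding_days_per_week(commit_timestamps, weeks):
    """A coding day is any calendar day with at least one commit."""
    coding_days = len({ts.date() for ts in commit_timestamps})
    return coding_days / weeks

def avg_pr_cycle_time_days(closed_prs):
    """Mean of (closed_at - created_at), in days, over closed PRs."""
    total = sum((closed - created).total_seconds()
                for created, closed in closed_prs)
    return total / len(closed_prs) / 86400  # seconds per day
```

Note that multiple commits on the same calendar day count as a single coding day.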

Proficiency

Proficiency measures the breadth and diversity of tasks or projects that an engineer is currently engaged in or capable of handling effectively. It is based on two metrics: Technical breadth and repo breadth.

  • Technical Breadth: This is the number of unique file extensions that were worked on within a user-defined time period. It is recommended that technical breadth average between 2 and 3 unique file extensions per month.

  • Repo Breadth: This is the number of unique repositories with successful code commits. It is recommended that a developer works on between 2 and 3 unique repos per month.
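
Both breadth counts are distinct-value tallies, sketched below; the commit record shape is an assumption for illustration:

```python
import os

def technical_breadth(changed_files):
    """Count distinct file extensions among files a developer touched.
    Files with no extension (e.g. Makefile) are ignored."""
    return len({ext for f in changed_files
                for ext in [os.path.splitext(f)[1]] if ext})

def repo_breadth(commits):
    """Count distinct repositories that received successful commits."""
    return len({c["repo"] for c in commits})
```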

Leadership and Collaboration

Leadership and collaboration measures developer teamwork and contribution to peer reviews. This is calculated from the following four metrics:

  • Number of PRs approved per month: This number represents how many PRs a developer approved in the last 30 days. The recommended number of approved PRs is between 2 and 7.

  • Number of PRs commented on per month: This number represents how many PRs a developer commented on in the last 30 days. The typical range for this value is between 2 and 7 PRs per month.

  • Average Response Time for PR approvals: This is the average time taken to approve another developer's PR. The industry standard for PR approval time is between 0.75 and 2 days.

  • Average Response Time for PR comments: This is the average time taken for a developer to add review comments on a PR. The industry standard for responding to a PR comment is between 0.75 and 1.5 days.

Modifying Factors and Weights

The Trellis Score allows for the modification of factors and their corresponding weights to customize the impact of each factor on the final score. This process enables a more tailored assessment of an engineer's performance.

Factors

In the Factors and Weights section of the Trellis profile, individual factors can be enabled or disabled. Enabling a factor includes it in the Trellis Score calculation, while disabling it excludes it from the calculation. You can typically set benchmarks or thresholds at the factor level. These benchmarks define specific performance levels that need to be met or exceeded for a given factor to positively impact the overall score.

Weights

Each factor's weight can be adjusted to assign varying levels of importance. Weights are assigned on a scale of 1 to 10, with 1 indicating low importance and 10 indicating high importance. Relative weights determine the significance of factors in relation to each other.

For example, if all factors are assigned a weight of 5, they contribute equally to the Trellis Score. Adjusting weights allows developers to prioritize certain factors over others based on their organizational goals.
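
As an illustration of how relative weights combine, the overall score could be computed as a weighted average of the enabled factors. The 0-100 factor scale and the treatment of a missing weight as "disabled" are assumptions for this sketch, not SEI's documented formula:

```python
def trellis_score(factor_scores, weights):
    """Weighted average of enabled factors.

    factor_scores: factor name -> score on a 0-100 scale (assumed).
    weights: factor name -> weight from 1 to 10; a factor with no
    weight is treated as disabled and excluded from the calculation.
    """
    enabled = [f for f in factor_scores if weights.get(f, 0) > 0]
    total_weight = sum(weights[f] for f in enabled)
    return sum(factor_scores[f] * weights[f] for f in enabled) / total_weight
```

With equal weights, every factor contributes equally; dropping a factor's weight removes it from the average entirely.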

Trellis Score widgets and reports

Trellis Score widgets and reports provide valuable insights into developer performance and facilitate data-driven decision-making. Add these widgets to your Insights to analyze Trellis Scores.

  • Trellis Score Report: Trellis Scores by developer.

  • Trellis Scores by Collection: Trellis Scores organized by Collection.

  • Individual Raw Stats: A table of base values that contribute to Trellis Scores.

  • Raw Stats by Collection: Base values organized by Collection.

Raw stats

The Individual Raw Stats and Raw Stats by Collection widgets show tables of metric values that contribute to Trellis Scores.

By default, the Individual Raw Stats widget shows the following raw, pre-calculation values for each developer:

  • PRs

  • Commits

  • Coding days

  • Average PR cycle time (in days)

  • Average issue resolution time (in days)

  • PRs commented on

  • PRs approved

  • Percentage of rework

You can edit the widget to show different values (add/remove columns) or apply filtering.

You can use the Download icon to download the Individual Raw Stats data.

Associations & Advanced Options

The Associations & Advanced Options settings allow users to associate profiles with specific Projects and Collections within an organization. This helps in managing and applying profiles effectively.

  • Projects: Contributors can select a project to view available Collections and assign Trellis profiles to specific projects. This feature helps in organizing and categorizing teams under different projects, allowing for a more granular management of Trellis profiles.

  • Collections: Contributors can select the Collections to which a Trellis profile applies. By associating Collections with Trellis profiles, contributors can ensure that the right profiles are applied to the appropriate teams or units within the organization. This facilitates customized score calculations.

Exclusions

Exclusions allow users to exclude specific Pull Requests (PRs) and commits from the current Trellis profile, which can affect score calculations.

  • Excluding Pull Requests: Users can specify PRs to be excluded from the current Trellis profile. Exclusions are helpful when certain PRs or commits should not be considered in score calculations, such as those related to experimental or non-standard work.

  • Excluding Commits: Users can list commits to be excluded from the current Trellis profile. Excluded commits are not factored into score calculations.

Development Stages Mapping

Development Stages Mapping allows users to map a Trellis profile with development stages from an issue management tool. Mapping development stages is valuable when organizations want to attribute scores to developers based on their contributions at different stages of a project's life cycle.

These features enhance the customization and precision of performance assessments within an organization using Trellis Scores.
