Adding "AI Impact" analytics to Value Stream Management
### Problem to solve

Demonstrating the ROI 💰 of AI investments is a key priority for engineering leadership (Director/VP/CTO), but [measuring AI’s true impact on the SDLC is highly challenging](https://gitlab.com/gitlab-org/gitlab/-/issues/441596#note_2285449715). To effectively quantify AI’s contribution and demonstrate its ROI, they need:

- To ensure that AI adoption drives meaningful improvements in the SDLC, and to understand [which metrics improved](https://gitlab.com/gitlab-org/gitlab/-/issues/414987#note_1649787700 "AI Usage Dashboard") as a result of these investments.
- To compare the performance of [teams that are using AI](https://gitlab.com/gitlab-org/gitlab/-/issues/414987#note_1630378937 "AI Usage Dashboard") against teams that are not.
- To track the [progress of AI adoption](https://gitlab.com/groups/gitlab-org/incubation-engineering/ai-assist/-/epics/3 "AI Impact - Dashboard MVC") to evaluate the potential of AI usage.
- To [automate insight extraction](https://gitlab.com/gitlab-org/gitlab/-/issues/414987 "AI Usage Dashboard") about the "AI Impact" from their organization's large volume of performance data.
#### Related feedback and validation

<details><summary>Click to expand</summary>

- gitlab#414987 and https://gitlab.com/groups/gitlab-org/incubation-engineering/ai-assist/-/epics/3
- List of customer calls: [1](https://gitlab.com/groups/gitlab-org/-/epics/12978#note_1827730119 'Adding "AI Impact" analytics to the Value Stream Dashboard'), [2](https://gitlab.com/groups/gitlab-org/-/epics/12978#note_1839023344 'Adding "AI Impact" analytics to the Value Stream Dashboard'), [3](https://gitlab.com/groups/gitlab-org/-/epics/12978#note_1888138936 'Adding "AI Impact" analytics to the Value Stream Dashboard'), [4](https://gitlab.com/groups/gitlab-org/-/epics/12978#note_1895887623 'Adding "AI Impact" analytics to the Value Stream Dashboard'), [5](https://gitlab.com/groups/gitlab-org/-/epics/12978#note_2009551026 'Adding "AI Impact" analytics to the Value Stream Dashboard'), [6](https://gitlab.com/gitlab-com/user-interviews/-/issues/32#note_2005392745 "VSM & DORA 2024 customers calls"), [7](https://gitlab.com/gitlab-com/user-interviews/-/issues/32#note_2025983840 "VSM & DORA 2024 customers calls"), [8](https://gitlab.com/gitlab-com/user-interviews/-/issues/32#note_2011935781 "VSM & DORA 2024 customers calls")
- [Enterprise Leaders “Defining Success” 🌟 Study 1](https://docs.google.com/presentation/d/1k9fJoqgkPX2zaK2-yQKydc_Xl3W44u1GIZNLUjQqZ5k/edit#slide=id.g2b14eae506f_0_0)

#### Solution validation (lite)

During the ROI tracker JTBD research (July–August 2024), participants were also asked to share their thoughts on the initial [AI Impact dashboard mockup](https://gitlab.com/-/group/9970/uploads/8a515caabab5f00399adf7307a4a7ff1/AI_impact_-_C_comparion_over_time.png).
- [Dovetail](https://gitlab.dovetail.com/tags/6DbFUfOJtKILKgdQt7mt7D) (internal)

</details>

### FAQ

https://gitlab.com/gitlab-org/gitlab/-/issues/512931+

### Mapping AI Impact To Business Outcomes

https://gitlab.com/gitlab-org/gitlab/-/issues/480070+

### Unit Primitives Metrics Matrix

https://gitlab.com/gitlab-org/gitlab/-/issues/480067+

### Roadmap

**Note:** The entirety of each step may not be worked on in sequential order. Dependencies, customer feedback, and technical feasibility will factor into the specific execution order; the outline below is intended to inform general prioritization.

- **Step 1a: Provide User-level Adoption Data.** This is easy to measure and answers the immediate customer need to understand who uses which features within Duo Enterprise.
  - TL;DR: "Are my developers using Duo? How are they using it?"
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15026+
- **Step 1b: Provide parity of AI Impact Analytics across all deployment types.**
  - TL;DR: "Whether I'm on GitLab.com, Dedicated, or self-managed, I should be able to see AI Impact Analytics."
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15029+
- **Step 1c: Iterative improvements to existing dashboard metrics, data visualization, and usability.**
  - TL;DR: "I need to see more dimensions for the existing metrics (e.g., a breakdown of Code Suggestions by IDE), visualize new vs. returning Duo users, visualize the data in more easily digestible ways such as line and bar charts, and have more control over date ranges."
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15030+
- **Step 2: Implement a tiering strategy for AI Impact Analytics in Duo Pro and Duo Enterprise.** Some level of AI Impact Analytics in each SKU is table stakes for customers.
  - TL;DR: "Whether I'm purchasing Duo Pro or Duo Enterprise, I need metrics that justify the ROI of spending the additional money on the add-on."
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15028+
- **Step 3: Provide insights into the impact of adopting Vulnerability explanation/resolution.** This is easier to measure and answers customers' essential questions about the upside of using Duo to explain and resolve vulnerabilities.
  - TL;DR: "Are we able to more efficiently increase the rate at which we can resolve and mitigate vulnerabilities?"
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15024+
- **Step 4: Provide insights into the impact of adopting Root Cause Analysis.** This is easier to measure and answers customers' essential questions about the upside of using Duo to fix broken pipelines.
  - TL;DR: "Are we able to more efficiently fix broken pipelines, reduce the overall rate of pipeline failures, and merge/deploy code at a faster rate?"
  - Epic: https://gitlab.com/groups/gitlab-org/-/epics/15025+
- **Step 5-?: Surface meaningful metrics for each unit primitive in Duo Pro and Duo Enterprise**
- **Later: AI Impact Analytics Cohorts**
  - TL;DR: "What is the overall impact of developers using Duo Pro/Enterprise vs. those developers that do not within my organization?"

---

<details>
<summary>Old version of Objectives/Features</summary>

# Proposed Objectives/features:

<details>
<summary>Click to expand</summary>

1. "AI Impact view" - visualize in one view the relationship between "AI usage" trends and SDLC trends to demonstrate the ROI of AI adoption.
   - [Objective 1](https://gitlab.com/gitlab-org/gitlab/-/issues/443698#conceptual-design 'UX: [VSD] "AI Impact view" MVC - visualize the relationship between "AI usage" trends and the SDLC performance.'): Users can observe how changes in one metric correlate with changes in others.
   - [Objective 2](https://gitlab.com/groups/gitlab-org/-/epics/13833 'AI Impact analytics - "Overview tiles"'): Analyze the effectiveness of the Code Suggestions functionality.
     `Code Suggestions: Acceptance Rate %`, `Unique Users`, [enabled](https://docs.gitlab.com/ee/subscriptions/subscription-add-ons.html#assign-gitlab-duo-pro-seats) vs. consumed.
   - [Objective 3](https://gitlab.com/gitlab-org/gitlab/-/issues/443698/designs/AI_impact_-_B_Team_comparison.png "AI_impact_-_B_Team_comparison.png"): Compare the performance between AI/non-AI groups, projects, and teams.
     - Insight example: "In January, AI adoption increased by 5%, leading to a 10% reduction in cycle times."
     - This could be shown in the [VSD metrics comparison panel](https://docs.gitlab.com/ee/user/analytics/value_streams_dashboard.html).
2. "AI Adoption view" - adding an "AI features" adoption view/table.
   - [Objective 4](https://gitlab.com/gitlab-org/gitlab/-/issues/443699 '[VSD] "AI Adoption view" MVC by adding "AI features" to the DevOps Adoption report'): Show which users are leveraging AI features and whether their performance has changed over time as a result of AI usage. Identify `Code Suggestion loaded/accepted` and non-AI groups.
   - Objective 5: Track AI adoption over time.
     - Insight example: "All groups under "Subgroup A" adopted AI, while "Subgroup B" is still at 50%."
     - This could be shown in the [DevOps Adoption table](https://gitlab.com/groups/gitlab-org/-/analytics/devops_adoption).
3. "AI Adoption insights" - adding algorithms to uncover insights from the correlations between "AI adoption" and "DevOps performance". These insights could be about:
   - [Objective 6](https://gitlab.com/gitlab-org/gitlab/-/issues/444619 '[VSD] Adding algorithms to uncover insights from the correlations between "AI adoption" and "DevOps performance"'): Highlighting significant correlations of key drivers and opportunities for improvement by adopting AI.
   - Objective 7: Highlighting patterns and trends in data.
   - Objective 8: Adding recommendations for improvement.
   - This could be added as a ["Get quick insights" button in VSD](https://gitlab.com/gitlab-org/gitlab/-/issues/389233#suggested-mvc-solution "VSM Dashboard - adding AI Descriptive Analytics ( “Duo insights”) with quick insights").

</details>

| Objectives/features | Conceptual designs |
|---------------------|--------------------|
| [Objective 1](https://gitlab.com/gitlab-org/gitlab/-/issues/443698#conceptual-design 'UX: [VSD] "AI Impact view" MVC - visualize the relationship between "AI usage" trends and the SDLC performance.'): Users can observe how changes in one metric correlate with changes in others. | ![AI_impact_-_C_comparion_over_time](/uploads/8a515caabab5f00399adf7307a4a7ff1/AI_impact_-_C_comparion_over_time.png) |
| [Objective 2](https://gitlab.com/groups/gitlab-org/-/epics/13833 'AI Impact analytics - "Overview tiles"'): "Overview tiles" - providing a clear and concise summary of the metrics, allowing users to quickly assess the current state of the AI features. This includes the effectiveness of the Code Suggestions functionality, e.g. `Code Suggestions: Acceptance Rate %`, `Unique Users`, [enabled](https://docs.gitlab.com/ee/subscriptions/subscription-add-ons.html#assign-gitlab-duo-pro-seats) vs. consumed. | ![overview_tiles1](/uploads/c6c0a52ac5977cc331b27dcd23a0da3b/overview_tiles1.png) |
| [Objective 3](https://gitlab.com/gitlab-org/gitlab/-/issues/443698/designs/AI_impact_-_B_Team_comparison.png "AI_impact_-_B_Team_comparison.png"): Compare the performance between AI/non-AI groups, projects, and teams. | ![Screenshot_2024-03-19_at_22.46.33](/uploads/9882e9b3c45f688ab8d9be4e3705d24b/Screenshot_2024-03-19_at_22.46.33.png) |
| [Objective 4](https://gitlab.com/gitlab-org/gitlab/-/issues/455860 "UX: AI Impact analytics - user engagement"): Understand how users are interacting with AI features - which users are leveraging them, and whether their performance has changed over time as a result of AI usage. | ![Option_D_-_Add_a_column_to_show_AI_useage](/uploads/5e811cb68a068d373458389ef0c1ac6a/Option_D_-_Add_a_column_to_show_AI_useage.png) |

</details>

---

### Leading metrics <> Target metrics :bar_chart:

<details>
<summary>Click to expand</summary>

#### Leading metrics (independent variables)

- `AI usage rate`
- `Code Suggestions Usage rate`

#### Target metrics (dependent variables or outcome variables)

- `Deployment Frequency`
- `Change failure rate`
- `Cycle time`
- `Lead time`
- `Mean time to merge`
- `Vulnerabilities`

</details>

## Feedback issue

We are excited to hear feedback from users! :blush: Please share your feedback or questions [here](https://gitlab.com/gitlab-org/gitlab/-/issues/456105 '"AI Impact" analytics - Feedback Issue 💭👂🔍').

_DISCLAIMER: This epic contains information related to upcoming products, features, and functionality. It is important to note that the information presented is for informational purposes only. Please do not rely on this information for purchasing or planning purposes. As with all projects, the items mentioned in this presentation are subject to change or delay. The development, release, and timing of any products, features, or functionality remain at the sole discretion of GitLab Inc._
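As a purely illustrative sketch, not part of this epic's scope or any GitLab API: the leading/target metric split above lends itself to the kind of correlation check Objective 6 describes. The monthly numbers and the `pearson` helper below are fabricated for illustration; a real implementation would pull aggregates from the analytics data store.

```python
# Hypothetical sketch: correlating a leading metric (AI usage rate) with a
# target metric (cycle time) across months. All numbers are made up.
from math import sqrt


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


# Monthly AI usage rate (%) and median cycle time (days) -- fabricated data.
ai_usage_rate = [10, 15, 22, 30, 38, 45]
cycle_time_days = [9.1, 8.7, 8.2, 7.5, 7.1, 6.4]

r = pearson(ai_usage_rate, cycle_time_days)
print(f"correlation: {r:.2f}")  # strongly negative here: usage up, cycle time down
```

A dashboard insight like "AI adoption increased by 5%, leading to a 10% reduction in cycle times" would need more than a raw coefficient (seasonality, confounders, significance), but this is the basic signal such algorithms would start from.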