Optimizing Dev Portals with
Analytics and Feedback
James Noes - PM
To optimize a product with analytics and feedback, we must follow practices that emphasize the importance of analytics and feedback.
Overview
• The unique challenge of opportunity for Dev Portals
• Side effects of having too many opportunities
• How might we use experiments to help us maintain good practices and better
optimize the experience
Many Opportunities
Partner Integrations
Accounts
Blog
Catalog
Newsletter
Documentation
FAQ
Analytics
What’s New
Try it Now
Security
Community
Notifications
Support
Events
Search
Feedback
API Management
Forums
Reporting
Use Cases
Success Stories
Terms of Service
Payments
RBAC
Governance
Health & Status
About Us
Standards
SDKs
Tutorials
Policies
Getting Started
Dev Tools
Social Media
Release Notes
Roadmap
Pricing
Administration
Videos
Scalability
Reliability & Performance
Developer Experience
Product Pages
API Access
Test Automation
Frameworks
Dependencies
Database Management
Tech Debt
Insights
Best Practices
Design
Open Source
Monitoring & Alerting
Infrastructure
Accessibility
API Reference
Careers
Marketing
News & Press
Testing
Monetization
Prioritization
[The same opportunity map as the previous slide, now sorted into Now, Next & Later buckets.]
Side effects of many opportunities
• Creating a detailed long-term roadmap
• The habit of “checking things off the list”
• Not questioning return on investment
• Building to meet stakeholder requirements
• Focusing on outputs, not outcomes
• Less emphasis on analytics and feedback
Traditional vs. Modern Product Development
[Diagram contrasting the Feature Factory (Traditional) with Product Development (Modern).]
How might we avoid it?
What's the difference?

Foundation & Features
• Minimal uncertainty
• Starts with "We know that by doing…, we will…"
• May take weeks, months, or even years
• Typically checked off "the list" because they have less uncertainty
• Only revisited when impacted by another feature

Experiments
• Moderate-to-high uncertainty
• Starts with "We believe that by doing…, we expect…"
• Timeboxed to emphasize return on investment
• Helps us refine "the list" with data
• Structured to emphasize assessing outcomes, learning, and iteration (see the sketch below)
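The talk describes this structure but never pins it down, so here is a minimal sketch of what such an experiment brief might look like in Python. The ExperimentBrief class and all of its field names are hypothetical illustrations, not part of the talk; only the two-month timebox and "3-4 measures" guidance come from the slides.

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ExperimentBrief:
    """Hypothetical structure that forces an experiment to state its bet up front."""
    hypothesis: str                     # "We believe that by doing X, we expect Y"
    timebox: timedelta                  # hard stop that protects return on investment
    key_measures: list[str]             # 3-4 measures, never just one
    success_criteria: dict[str, float]  # measure name -> threshold for success

# Example instance, mirroring the experiment on the next slide.
brief = ExperimentBrief(
    hypothesis="By providing use cases for our API products, "
               "we expect an increase in active apps.",
    timebox=timedelta(weeks=8),  # the slide's two-month run
    key_measures=["active apps", "unique views", "time on page", "CTA engagement"],
    success_criteria={"active_app_increase_pct": 6.0, "cta_click_pct": 30.0},
)
print(brief.timebox.days, "days to learn something")
```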
Example Experiment
Hypothesis
We believe that by providing use cases for our API products, we will see an increase in active (consumer) apps.
Scope/Requirements
• Limit development and testing to 1 sprint
• Experiment will last 2 months (no changes to use cases)
• No automated testing required
• Accessible in top 20% of homepage
• Use cases must have CTAs
Key Measures
• Number of active apps
• Unique Views
• Time on Page
• CTA Engagement
Success Criteria (encoded in the sketch below)
• 6% increase in active apps over previous 2 months
• 25% of new users engage with use cases
• Time on page = at least 70% of read time
• 30% of users click CTA
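As a rough illustration, these success criteria could be encoded as data so that assessing the experiment becomes mechanical rather than debatable. This is a hedged sketch: only the threshold values come from the slide, while the metric keys, the evaluate_experiment helper, and the dict shapes are all hypothetical.

```python
# Hypothetical sketch: the slide's success criteria as data, plus a checker.
# Only the threshold values come from the slide; all names are illustrative.

SUCCESS_CRITERIA = {
    "active_app_increase_pct": 6.0,   # >= 6% increase vs. the previous 2 months
    "new_user_engagement_pct": 25.0,  # >= 25% of new users engage with use cases
    "time_on_page_ratio_pct": 70.0,   # time on page >= 70% of estimated read time
    "cta_click_pct": 30.0,            # >= 30% of users click a CTA
}

def evaluate_experiment(results: dict[str, float]) -> dict[str, bool]:
    """Flag each criterion as met or missed for the measured results."""
    return {name: results.get(name, 0.0) >= threshold
            for name, threshold in SUCCESS_CRITERIA.items()}

# Example: a run that clears every bar.
print(evaluate_experiment({
    "active_app_increase_pct": 8.0,
    "new_user_engagement_pct": 60.0,
    "time_on_page_ratio_pct": 90.0,
    "cta_click_pct": 50.0,
}))
```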
Traditional vs. Modern Product Development
[The Feature Factory (Traditional) vs. Product Development (Modern) diagram, revisited.]
Assessing Results
Success Criteria
• 6% increase in active apps over previous 2 months
• 25% of new users engage with use cases
• Time on page = at least 70% of read time
• 30% of users click CTA
Scenario 1
• 8% increase in active apps
• 60% of new users engage with use cases
• Time on page = 90%
• 50% of users click CTA
Experiment was successful – it is obvious that use cases contributed to the outcome.
Scenario 2
• 1% increase in active apps
• 8% of new users engage with use cases
• Time on page = 20%
• 5% of users click CTA
Experiment was unsuccessful – it is unlikely that use cases will have an impact. We should discuss removing use cases from the application to avoid a cluttered experience.
Scenario 3
• 3% increase in active apps
• 10% of new users engage with use cases
• Time on page = 100%
• 80% of users click CTA
Experiment did not achieve the expected results – but the engagement numbers suggest that use cases could be effective. Consider re-running the experiment with a different placement (see the sketch below).
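To make the pattern concrete, here is a hedged sketch that feeds the three scenarios above through the same threshold check. The scenario numbers come from the slide; the verdict logic (all pass / none pass / mixed) is an illustrative simplification of the judgment calls described in the text, and all names are hypothetical.

```python
# Hypothetical sketch: run the slide's three scenarios through the same
# criteria check and reduce each one to a verdict.

SUCCESS_CRITERIA = {
    "active_app_increase_pct": 6.0,
    "new_user_engagement_pct": 25.0,
    "time_on_page_ratio_pct": 70.0,
    "cta_click_pct": 30.0,
}

SCENARIOS = {
    "Scenario 1": {"active_app_increase_pct": 8.0, "new_user_engagement_pct": 60.0,
                   "time_on_page_ratio_pct": 90.0, "cta_click_pct": 50.0},
    "Scenario 2": {"active_app_increase_pct": 1.0, "new_user_engagement_pct": 8.0,
                   "time_on_page_ratio_pct": 20.0, "cta_click_pct": 5.0},
    "Scenario 3": {"active_app_increase_pct": 3.0, "new_user_engagement_pct": 10.0,
                   "time_on_page_ratio_pct": 100.0, "cta_click_pct": 80.0},
}

for name, results in SCENARIOS.items():
    met = sum(results[k] >= v for k, v in SUCCESS_CRITERIA.items())
    total = len(SUCCESS_CRITERIA)
    if met == total:
        verdict = "successful"
    elif met == 0:
        verdict = "unsuccessful - consider removing the feature"
    else:
        verdict = "mixed - partially effective, iterate (e.g. on placement)"
    print(f"{name}: {met}/{total} criteria met -> {verdict}")
```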
Recommendations
• Determine how much of your time you can afford to spend on
experimentation (20% or less to start)
• Bring the experimental mindset back into new feature development
• Always have multiple key measures (3-4 is typically best)
• Focus on experiments as a learning opportunity
• Embrace learning – avoid output as a measure of success
• Unsuccessful experiments are a success if you minimize investment and
learn
Thank you!