SQS – the world's leading specialist in software quality


SQS – the world's leading specialist in software quality
Building a lean, mean, BDD automation machine
Tom Roden, SQS, September 2013


Safia – building an executable documentation system
Born out of necessity:
• Large hedge fund COTS implementation
• Crippling regression test burden
Results:
• From a 6–8 month release cycle to daily releases
• 80 times defect reduction
Then did it again… and again:
• Same patterns, same automation platform


The problem quantified – real data from the field
[Chart: time taken for release regression testing by month, plotting man days of effort for regression testing against the number of test suites and the number of automated suites]


The confidence problem!
[Diagram: the project team delivering new work across Sprint 1 to Sprint "n", with defects emerging at UAT]


What it felt like…


Build integrity in, become acceptance driven…
Backlog → Ready user story → Specify tests / examples → Automate acceptance tests → Develop → Demo / showcase → Customer verification → Session-based testing
Roles: stakeholder(s) & BA, developers, testers
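To make the "specify tests / examples, then automate" steps concrete, here is a minimal hedged sketch of what one agreed example could look like once bound to an executable acceptance test. The names (TradeCaptureDriver, bookTrade, positionFor) and the in-memory stub are invented for illustration; they are not part of Safia or the real adaptor.

// Hypothetical sketch: an example agreed with the business, turned into an
// executable acceptance test (JUnit 4). Names are illustrative only.
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import java.util.HashMap;
import java.util.Map;

public class BookEquityTradeAcceptanceTest {

    // Minimal in-memory stand-in for the adaptor that would normally drive
    // the trading system through its server-side API.
    static class TradeCaptureDriver {
        private final Map<String, Integer> positions = new HashMap<>();

        void bookTrade(String side, String instrument, int quantity) {
            int signed = "BUY".equals(side) ? quantity : -quantity;
            positions.merge(instrument, signed, Integer::sum);
        }

        int positionFor(String instrument) {
            return positions.getOrDefault(instrument, 0);
        }
    }

    private final TradeCaptureDriver system = new TradeCaptureDriver();

    @Test
    public void bookingAnEquityTradeUpdatesThePosition() {
        // Given a flat position in the instrument
        assertEquals(0, system.positionFor("VOD.L"));
        // When the trader books a buy of 1,000
        system.bookTrade("BUY", "VOD.L", 1000);
        // Then the position reflects the new holding
        assertEquals(1000, system.positionFor("VOD.L"));
    }
}

In the real pipeline this test would fail until the development step makes it pass, which is what gives the demo and customer verification stages their confidence.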


Pretty fast ROI, but other effects too…
[Chart (repeated): time taken for release regression testing by month, showing man days of effort for regression testing alongside the number of test suites and the number of automated suites]


Enabling increasingly faster release cycles
[Diagram: project timeline from Sprint 1 to Sprint n – Release 1 after Sprint 2, Release 2 after Sprint 5, Release 3 after Sprint 7, then Release 4 and further releases at ever shorter intervals]


Knowledge and experience to design a framework – Safia
• A set of design patterns – binding tests to investment apps
• Inject knowledge – from business specialists
• Vanilla acceptance tests – easy to customise
• Living documentation – tests as specifications


Illogical architecture diagram
• FitNesse fixtures (Safia fixtures) – map SAFIA terminology to system terminology
• House system adaptor – maps investment house terminology to SAFIA terminology, e.g. Trade Capture = Trade Entry
• API – SAFIA automates at the server level via the API
• Trading system – e.g. Calypso, Beauchamps, Summit
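As a rough illustration of this layering (assumed names, not the real SAFIA code), the sketch below shows a house system adaptor translating SAFIA's "trade capture" vocabulary into one system's "trade entry", driven through its server-side API rather than the GUI.

// Hypothetical sketch of the adaptor layer; interface and class names are
// illustrative placeholders, not the real SAFIA API.
public interface HouseSystemAdaptor {
    // SAFIA-level operation, expressed in SAFIA terminology
    String captureTrade(String instrument, String counterparty, double notional);
}

// One adaptor per investment house system (e.g. Calypso, Summit, Beauchamps);
// package-private here so the sketch compiles as a single file.
class CalypsoStyleAdaptor implements HouseSystemAdaptor {
    @Override
    public String captureTrade(String instrument, String counterparty, double notional) {
        // Translate "trade capture" into this system's "trade entry" and call
        // its server-side API. The real call is elided; a trade id is returned
        // so later fixtures can verify positions and bookings.
        return "TRADE-0001";
    }
}

The design point is that the FitNesse fixtures only ever see the SAFIA vocabulary, so swapping the trading system means writing a new adaptor, not rewriting the tests.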


Creating automated tests using Safia
• Test archetype ("test function") – an archetypal function valid for any instrument, e.g. create a trade, check a position
• Instrument template – e.g. Equity, FX, Future, IRS
• Apply template & archetype – e.g. the "create a trade" archetype plus the IRS template yields the test "Create a valid IRS trade"
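A hedged sketch of the archetype-plus-template idea, with invented types (TestArchetype as a functional interface, a plain Map as the instrument template): a generic "create a trade" function is combined with an IRS template to produce the concrete test.

// Illustrative only: combining a generic test archetype with an instrument
// template to derive a concrete acceptance test. Names are hypothetical.
import java.util.Map;

public class ArchetypeExample {

    // An archetypal function, e.g. "create a trade", valid for any instrument.
    interface TestArchetype {
        String apply(Map<String, String> instrumentTemplate);
    }

    public static void main(String[] args) {
        TestArchetype createTrade = template ->
            "Create a valid " + template.get("instrument") + " trade with " + template;

        // Instrument template, e.g. an interest rate swap (IRS)
        Map<String, String> irsTemplate = Map.of(
            "instrument", "IRS",
            "notional", "10,000,000",
            "tenor", "5Y");

        // Applying template & archetype yields the concrete test
        System.out.println(createTrade.apply(irsTemplate));
    }
}

The same archetype applied to an Equity or FX template would generate the corresponding tests without any new automation code.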


But it was easy, right…?


Real data from the field – the path to faster release
[Chart (repeated): time taken for release regression testing by month, showing man days of effort for regression testing alongside the number of test suites and the number of automated suites]


What did we learn?
• Iterate, iterate, iterate
• Start simple
• Flexible
• Sooner rather than later
• Maintenance
• Record knowledge


Significant effects – improvements in release speed and quality

Time Period           # Releases   Regression Effort     # Live Defects   # Severity 1/2 Defects
Period 1 (8 months)   1            40 man days           152              42
Period 2 (6 months)   2            67.5–100 man days     59               11
Period 3 (6 months)   9            78–23 man days        61               4
Period 4 (6 months)   14           15–0 man days         9                1
Period 5 (6 months)   28           0 man days            2                0


Significant effects – increased productivity by ~350%
[Graph: delivered story points per sprint (sprints 1–12), showing actual delivered story points and the 3-sprint moving average velocity]


Benefits beyond the obvious
• Increased collaboration
• Reduced cost of ownership
• Reliable system documentation
• Teaching & coaching aid
• Anyone can interact with it


~1/3 of the total cost of ownership = system understanding


How we wrote the tests – tests as specs, the principles of test design
What makes a good acceptance test?
• Self-documenting
• Based on testable statements
• About business intentions
• Anyone can understand it
• A specification, not a script
• Concise and granular
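To show how these principles might read in code, here is a small sketch of a test written as a specification of business intent rather than a UI script. The settlement rule, dates and names are illustrative assumptions, with a tiny in-line stub so the example runs on its own.

// Sketch only: a test named and structured as a business specification.
import org.junit.Test;
import static org.junit.Assert.assertEquals;
import java.time.DayOfWeek;
import java.time.LocalDate;

public class SettlementSpecification {

    // Minimal stand-in for the system under test: spot FX settles T+2,
    // skipping weekends (holidays ignored for brevity). Illustrative rule only.
    static LocalDate spotSettlementDate(LocalDate tradeDate) {
        LocalDate date = tradeDate;
        int businessDaysAdded = 0;
        while (businessDaysAdded < 2) {
            date = date.plusDays(1);
            if (date.getDayOfWeek() != DayOfWeek.SATURDAY
                    && date.getDayOfWeek() != DayOfWeek.SUNDAY) {
                businessDaysAdded++;
            }
        }
        return date;
    }

    @Test
    public void aSpotFxTradeSettlesTwoBusinessDaysAfterTheTradeDate() {
        // Reads as a testable business statement, not a sequence of UI steps.
        assertEquals(LocalDate.of(2013, 9, 4),
                spotSettlementDate(LocalDate.of(2013, 9, 2)));
    }
}

The test name and single assertion carry the specification; anyone on the team can read it as documentation of the intended behaviour.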


Test metrics and measurements

Efficiency
• Test (feature / suite / pack) run time and coverage
• Time spent manually 'assisting' execution
• Amount of waste in (testing) processes

Effectiveness
• Cost of defects found in live
• % defects to throughput (# defects / velocity)
• # of defects due to regression issues (as opposed to new functionality) found in live
• Test coverage
• Test technical debt

Reliability
• Number of runs passed / failed
• Proportion of tests that fail due to non-genuine errors (intermittency)
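For two of the measures above, a hedged sketch of the arithmetic: defects per unit of throughput (# defects / velocity) and intermittency (the share of failed runs caused by non-genuine errors). The input figures are illustrative placeholders, not project data.

// Illustrative arithmetic for two of the measures above; values are made up.
public class TestRunMetrics {

    public static void main(String[] args) {
        // Effectiveness: % defects to throughput (# defects / velocity)
        int liveDefectsInSprint = 3;
        int deliveredStoryPoints = 42;
        double defectsPerStoryPoint = (double) liveDefectsInSprint / deliveredStoryPoints;

        // Reliability: proportion of failing runs caused by non-genuine errors
        int failedRuns = 20;
        int failuresFromEnvironmentOrTiming = 14;   // "intermittent" failures
        double intermittency = (double) failuresFromEnvironmentOrTiming / failedRuns;

        System.out.printf("Defects per story point: %.3f%n", defectsPerStoryPoint);
        System.out.printf("Intermittency: %.0f%%%n", intermittency * 100);
    }
}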


Test metrics and measurements

Updatability
• Time / cost to locate a test
• Time taken to CRUD a test
• For a given change:
  • How long to assess how many tests need updating
  • How many tests actually need updating
  • How long updating them takes

Readability / document-ability
• How long to understand (all of) the intentions of a test
• How long it takes to locate and analyse the reason for a test failure
• How many tests are not approved for acceptance decisions
• Time spent on release go/no-go meetings with stakeholders


To conclude
Source: http://drfunkhowsersprescriptionpad.wordpress.com


Thank you for listening.
Tom Roden, Head of Agile Services
SQS Group Limited, 7–11 Moorgate, London EC2R 6AF, United Kingdom
Phone: +44 20 7448 4620  Fax: +44 20 7448 4651
m +44 7917 601706  t +44 20 7448 4620
e tom.roden@sqs.com  info-uk@sqs.com
www.sqs.com


Pillars of effective testing
Confidence, supported by:
• Courage
• Credible, rapid & regular feedback
• Track record
• Safety
• Clear test intentions
• Visible results
Underpinned by flexible test automation frameworks:
• Testability, data & dependency management
• Environment, build & configuration management
• High test coverage


Taxonomy of testing
[Diagram: business-facing tests – acceptance tests, system tests, end-to-end (system integration) tests; technology/team-facing tests – unit tests, component tests, technical tests]


Definition of Done – checklist
High-level Definition of Done: "Functionality that has been accepted as of suitable quality to be shipped to production" (but doesn't necessarily have to be shipped).
Definition of Done checklist:
1. Covered functional, performance & DR criteria
2. Unit tests
3. FitNesse acceptance tests
4. Acceptance environment
5. Configuration version control
NOT included in the Definition of Done but part of sprint completion: sprint review / demo to Product Owner & business