Logo design: Sarah Cole Design
SQGNE is made possible by the support of our sponsors:
Oracle and
Sun Integrated Systems
Dec 2014
 All-volunteer non-profit with no membership dues!
 Supported entirely by our sponsors…
 Over 1,275 members on LinkedIn and 875 on Constant Contact
 Monthly meetings, September through July, on the 2nd Wednesday of each month
 SQGNE Web site: www.sqgne.org
Officers / Hosts / Mission
Mission
 To promote the use of engineering and management techniques that lead to the delivery of high-quality software
 To disseminate concepts and techniques related to software quality engineering and the software engineering process
 To provide a forum for discussion of concepts and techniques related to software quality engineering and the software engineering process
 To provide networking opportunities for software quality professionals
Current Officers:
 John Pustaver – Founder
 Steve Rakitin – President
 Stan Wrobel – Vice President
 Barbara Wioncek – Treasurer
 David Sullivan – Clerk
Our gracious Hosts:
 Paul Ratty
 Tom Arakel
 Marge Shinkle
 Jack Guilderson
At-large Directors:
 Robin Goldsmith
 Marge Shinkle
 Jim Turner
SQGNE 2014-15 Schedule
Speaker | Affiliation | Date | Topic
Jim Parson, Nick Olivo | Checkpoint Tech., SmartBear | Sept 10 | Test Tool Bake-off featuring HP WebInspect (Checkpoint Tech.) and TestComplete (SmartBear)
Harish Narayan | | Oct 8 | Leveraging lean systems development and testing to maximize business value
Michael Mah | QSM Associates | Nov 12 | Passion: What Software Teams Might Learn About Creating Lasting Change from Eco-pirates
Peter Varhol, Gerie Owen | | Dec 10 | How do we fix testing?
Stan Wrobel | Mass. Dept of Education | Jan 14 | Agile Development and the changing roles for Business Analysts and QA
Robin Goldsmith | GoPro Management | Feb 11 | Is Exploratory Testing Really Better?
Mario Moreira | | Mar 11 | Agile – Bringing Testing to the Forefront
Andrew Ambroise | | Apr 8 | Shifting Quality to the Left; Negotiating Cleaner Results
Gerie Owen | NSTAR | May 13 | Testing the Internet of Things: Human Experience Testing
David Heimann | | June 10 | IEEE Standard 730-2014 Software Quality Assurance Processes; Annual Election of Officers
Everyone | | July 8 | Annual Hot Topics Night…
Tonight’s Topic
How do we fix testing?
Peter Varhol and Gerie Owen
Perhaps in no other professional field is the gap between theory and practice as stark as it is in software testing.
Researchers and thought leaders claim that testing requires a high level of cognitive and
interpersonal skills, in order to make judgments about the ability of software to fulfill its operational goals. In their minds,
testing is about assessing and communicating the risks involved in deploying software in a specific state.
However, in many organizations, testing remains a necessary evil, and a cost to drive down as much as possible. Testing
is merely a measure of conformance to requirements, without regard to the quality of requirements or how conformance
is measured. This is certainly an important measure, but tells an incomplete story about the value of software in support
of our business goals.
We as testers often help to perpetuate the status quo. Although in many cases we realize we can add far more value than
we do, we continue to perform testing in a manner that reduces our value in the software development process.
This presentation looks at the state of the art as well as the state of common practice, and attempts to provide a rationale
and roadmap whereby the practice of testing can be made more exciting and stimulating to the testing professional, as
well as more valuable to the product and the organization.
Bios
Peter Varhol is a well-known writer and speaker on software and technology topics, having authored dozens of articles
and spoken at a number of industry conferences and webcasts. He has advanced degrees in computer science, applied
mathematics, and psychology, and is currently Principal at Technology Strategy Research. His past roles include
technology journalist, software product manager, software developer, and university professor.
Gerie Owen is a Test Architect who specializes in developing and managing test teams. She has implemented various
Quality Assurance methodology models, and has developed, trained and mentored new teams from their inception. Gerie
manages large, complex projects involving multiple applications, coordinates test teams across multiple time zones and
delivers high quality projects on time and within budget. In her everyday working life, Gerie brings a cohesive team
approach to testing. She has also presented at several conferences and authored articles on testing and quality assurance
at www.qsma.com.
HOW DO WE FIX TESTING?
WHY WE HAVE A GREAT OPPORTUNITY TO MAKE IMPROVEMENTS TO OUR CRAFT
THOUGHT LEADER
Peter Varhol
peter@petervarhol.com
International speaker on technology topics
Technology Evangelist and Writer, Product
Manager, University Professor
Someone who cares about getting testing right
PRACTITIONER
Gerie Owen
gerie@gerieowen.com
Quality Assurance Consultant
Speaker and Writer on Testing topics
Experienced Tester, Test Lead, & Test Architect
Marathon Runner & Running Coach
Meet Our Cats... Max, Stephanie & Monica
AGENDA
• What we do
• Why it’s not right
• What thought leaders say we should do
• What we need to do differently
• How we get there
• Summary and conclusions
OUR APPLICATIONS HAVE PROBLEMS
EVEN THE IMPORTANT ONES
WHAT HAPPENED TO TESTING?
• We don’t have time for it
• We have to get to market
• We don’t have funding to test
• We haven’t shown sufficient value
• So we let our users test
• That seems to work
INCREASINGLY THE RESULT IS FAILURE
• And everyone knows it’s going to fail
• But no one wants to say so
• So we are all complicit
• Because we know better
WHAT WE DO
• Write test plans and procedures
• Analyze and deconstruct requirements
• Write test cases
• Execute test cases and compare results to requirements (see the sketch after this list)
• Make a determination on whether requirements have been met
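To make that comparison concrete, here is a minimal sketch of a requirements-based test case in Python. The requirement ID (REQ-042), the apply_discount function, and the expected values are hypothetical, invented only to show the pattern of checking observed behavior against a stated requirement; they are not from the talk.

```python
# Hypothetical requirement REQ-042: orders of $100 or more receive a 10% discount.
# Function and values are illustrative only.

def apply_discount(order_total: float) -> float:
    """System under test: returns the payable amount after any discount."""
    return order_total * 0.9 if order_total >= 100 else order_total


def test_req_042_discount_applied_at_threshold():
    # Expected result comes straight from the requirement text.
    assert apply_discount(100.00) == 90.00


def test_req_042_no_discount_below_threshold():
    assert apply_discount(99.99) == 99.99
```

Run with pytest; each test either matches the requirement or it does not, which is exactly the yes/no framing the next slide questions.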
WHAT WE DO
• Testing finds bugs
• Bugs are a mismatch between requirements and
implementation
• It’s an investigation with a yea or nay answer
• If only it were that simple
WHAT WE DO
• Classify bugs by priority and severity
• Allowing room for negotiation and compromise
• Provide metrics on test cases executed (a metrics sketch follows this list)
• Use test cases and metrics to make decisions on quality
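As an illustration of what such metrics typically look like, the sketch below tallies a pass rate and open defects by severity from a small, made-up set of execution results. The field names and numbers are assumptions for the example, not data from the presentation.

```python
from collections import Counter

# Illustrative test-cycle results; names and values are invented for the sketch.
results = [
    {"case": "TC-001", "status": "pass"},
    {"case": "TC-002", "status": "fail", "severity": "high", "priority": "P1"},
    {"case": "TC-003", "status": "pass"},
    {"case": "TC-004", "status": "fail", "severity": "low", "priority": "P3"},
]

executed = len(results)
passed = sum(1 for r in results if r["status"] == "pass")
defects_by_severity = Counter(r["severity"] for r in results if r["status"] == "fail")

print(f"Executed: {executed}, pass rate: {passed / executed:.0%}")
print("Open defects by severity:", dict(defects_by_severity))
# Numbers like these inform a quality decision; they do not make it by themselves.
```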
WHAT WE DO
• Many organizations call this Quality Assurance
• But we can’t assure quality
• Quality is the result of lifecycle processes
• And we can’t build in quality
• But we can test and evaluate
• If not quality, then what?
AND WHAT IS QUALITY?
• Is it really a mismatch of requirements and reality?
• That makes several assumptions
• The requirements are accurate and complete
• The requirements are unambiguous
• The requirements are all we care about
• We talk about 100 percent test coverage
• But do we really know? (a small coverage illustration follows)
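One way to see why a coverage number answers less than it seems to: in the hypothetical sketch below, a single test exercises every line and branch of shipping_cost(), so a coverage tool would report 100 percent, yet an inverted boundary condition ships undetected. The function, the business rule, and the values are invented for illustration.

```python
# Hypothetical rule: free shipping only for parcels UNDER 1 kg.

def shipping_cost(weight_kg: float) -> float:
    # Bug: the boundary is wrong; exactly 1.0 kg should be charged, but isn't.
    if weight_kg <= 1.0:
        return 0.0
    return 5.0 + 2.0 * weight_kg


def test_shipping_cost():
    # Touches both branches, so line and branch coverage read 100 percent,
    # but the behavior at exactly 1.0 kg is never questioned.
    assert shipping_cost(0.5) == 0.0
    assert shipping_cost(2.0) == 9.0
```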
WHAT THOUGHT LEADERS SAY WE SHOULD DO
• Think
• Act
• Go beyond the requirements
• Look at the context
• Evaluate and make judgments
• Communicate and advise
AND WHAT IS QUALITY?
• Does testing to requirements enable us to find
• Inaccurate or incomplete requirements?
• Inefficient database calls?
• Memory or object leaks? (see the sketch after this list)
• Just plain bad design?
• External or internal
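A sketch of that gap, under invented assumptions: the first test below satisfies a functional requirement and passes, while the second, which no requirement asked for, uses Python's tracemalloc to show that repeated calls retain memory without bound. The handle function, its cache, and the 10 MB threshold are all hypothetical, and the leak check is expected to fail, which is precisely the information requirements-only testing never surfaces.

```python
import tracemalloc

# Illustrative handler that "meets its requirement" (returns the right value)
# while quietly leaking by remembering every request it has ever seen.
_cache = []

def handle(request: str) -> str:
    _cache.append(request * 1000)  # unbounded growth no requirement mentions
    return request.upper()


def test_functional_requirement():
    assert handle("ping") == "PING"  # passes; requirements-based testing stops here


def test_memory_does_not_grow_unbounded():
    tracemalloc.start()
    for i in range(10_000):
        handle(f"req-{i}")
    current, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    # Crude guard rail: fail if repeated calls retain more than ~10 MB.
    # Expected to FAIL for this handler, exposing the leak.
    assert current < 10 * 1024 * 1024, f"retained {current} bytes after 10k calls"
```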
AND WHAT IS QUALITY?
• So we have to do more
• Yes, it is our problem
WHAT THOUGHT LEADERS SAY WE SHOULD DO
• Testing to requirements is a necessary part of the process
• But . . .
• It’s necessary but not sufficient
RULES OF THE ROAD
• Testing requires understanding the problem domain intimately
• Domain experts must be part of the team
• Testing requires understanding the requirements
• Testing requires thinking like a user
• And being willing to take unpopular stands
AND BY THE WAY
• A lot of testing today is done by users
• Whether they know it or not
• Whether they’re qualified or not
• And they probably have no idea of the purpose or
requirements
• Let alone context
AND THAT’S A PROBLEM
• Software that should be better isn’t
• Often not even tested to requirements
• And many software organizations don’t seem to care
• We should care about that
WHAT SHOULD WE DO DIFFERENTLY?
• Understand our own Mindsets
WHAT ARE MINDSETS?
• Developed by psychologist Carol Dweck
• How we mentally approach life and its challenges
• Why brains and talent don’t bring success
• How they can stand in the way of it
• Why praising brains and talent doesn’t foster self-esteem and
accomplishment, but jeopardizes them
WHAT IS A MINDSET?
• Broadly categorized in one of two ways
• Fixed
• Growth
IMPLICATIONS OF A FIXED MINDSET
• I did my job right; someone else screwed up
• That’s not really a bug, so I didn’t miss it
• My test cases are complete
• I test to requirements
• You can’t ship this software
IMPLICATIONS OF A GROWTH MINDSET
• I always have more to learn
• I missed that bug, and I understand why
• I can adapt my testing approach to new trends
• I advise decision makers on risks of releasing software
• I don’t ensure quality, whatever my title is
HOW DO WE FIX TESTING?
• Personally Adopt a “Growth Mindset”
• Evangelize Testing within the Context of Your Organization
YOUR GROWTH MINDSET
• Use your intelligence
• If something seems wrong to you, it probably does to others
• Test Beyond Requirements
• Exploratory Testing
• Experiment with new test approaches
• Field testing for mobile devices
• Study your craft
• Learn about new technologies and apply your knowledge
EVANGELIZE TESTING
• APM is for testers too (a latency-check sketch follows this list)
• Goes beyond the IT ops people
• Do root cause analysis
• Know why the problem exists
• You’re the expert; let’s hear your opinion
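As one small example of APM-style thinking a tester can own, the sketch below wraps a simulated checkout transaction and asserts a latency budget alongside the functional result. The transaction, the 200 ms budget, and the workload are assumptions for illustration, not anything prescribed in the talk or by any particular APM product.

```python
import time

def checkout(items: list[str]) -> float:
    """Stand-in for the transaction an APM tool would trace in production."""
    time.sleep(0.05)  # simulate backend work
    return 10.0 * len(items)


def test_checkout_meets_latency_budget():
    start = time.perf_counter()
    total = checkout(["book", "pen"])
    elapsed = time.perf_counter() - start
    assert total == 20.0                                    # functional requirement
    assert elapsed < 0.200, f"checkout took {elapsed:.3f}s; budget is 200 ms"
```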
UNDERSTAND YOUR ORGANIZATION
• Depending on your organization
• You may be thanked for going the extra mile
• You may be criticized for exceeding your portfolio
• When you feel burned out, discouraged or bored
• Participate in the professional community
SUMMARY
• Make it personal while remaining professional
• Think beyond the requirements
• Does it feel right?
• Experiment
• Work with development early on
• You are all on the same side
• Make recommendations
THOUGHTS?
• Thank you

Editor's Notes

  • #14: This implies that something has changed in the testing realm. But we’ve never had sufficient time or funding. I don’t think it’s that we haven’t shown sufficient value. It’s just that when things go bad, everyone notices; when things go well, that’s the expectation and no one notices. I think there’s something to be said for that. But I also think the time pressures are greater today.
  • #15: Testers aren’t necessarily all complicit or complacent. Our role, as I have always understood it: the role of the tester is to provide clear information about the functionality, stability, performance, usability, etc., of the application under test, and to assess and advise on the risk of deploying it in its current condition. Then explain to me the role of testing in healthcare.gov. This should have been the most tested application on the planet. Instead, everyone just shut their eyes and mumbled.
  • #19: As we move beyond Agile into continuous delivery and continuous deployment, the tester’s role actually shifts toward quality assurance. In continuous delivery, the entire team, including developers, systems administrators, and operations, must be responsible for testing. Since the tester is the expert, it becomes the tester’s role to manage the testing process. And that’s what I think I say on this slide – that quality is the result of process.
  • #23: Isn’t this old news? Given all the disruptive trends in testing, including cloud, mobile, "human experience", Agile, and continuous delivery, aren’t we already testing more than just requirements, sometimes without requirements, and sometimes creating requirements?
  • #33: However, you need to understand and evangelize your role in the context of your own organization. In some organizations, testers are involved in testing post-production defects and with APM. In other organizations, testers work solely pre-production and the business takes over post-production testing. Some of this is dictated by SOX requirements. In TCoEs with offshore test teams, the TCoE experts might be involved post-production whereas the offshore teams are tasked with pre-production testing only. Testers can compare notes by suggesting a weekly meeting, or by sharing through a collaboration tool, etc.