CAST 2007: Lee Copeland Keynote

(This is based on my notes for the keynote.  We've been told we'll get Lee's slides, and when we do I'll review them and, if appropriate, update this page.)

Lee Copeland's keynote kicked off CAST 2007.  Lee started off by talking about some things he believes about testing:

  • Testing exists to provide services to the development team.  Testers are the ones who need to think about what can go wrong.
  • No best practices - only good practices within a particular context or contexts.
  • Metrics that are not valid are dangerous.
  • All oracles are fallible.
  • Automated human testing is nonsensical.
  • You won't find all the bugs if you only test one way - different types of defect require different types of tests to find them.
  • You shouldn't create documentation for its own sake - if you spend time creating a document, make sure you know why you are doing it.  Valid reasons for creating a document include remembering something or communicating something.

He then went on to talk about what he considers innovations in testing that are currently going on:

  • Testing Specialties
    In the past, testers didn't specialize.  Nowadays they do.  There are many testing specialties today: performance testing, security testing, and so on.
  • Test-First Development
    There are benefits: it prevents building up an inventory of things that don't work in your program, and it captures requirements in test cases.  Lee also mentioned an important question that Anthony Marcano (creator of the web site) asked at Star West: why wasn't test-driven development driven by the testing community?
  • New Testing Books
    There are many new testing books (of varying quality) coming out every week.  Lee mentioned (and Cem Kaner also mentioned during the conference) that the classic software testing book, Testing Computer Software, will be updated in about a year.  Lee also asked: "Do testers read about their profession?"  In a recent test process assessment, it was discovered that many testers hadn't read even one book on software testing.  Imagine you were about to have surgery, asked the doctor whether he had ever read even one book on surgery, and he said no.  How comfortable would you be letting that doctor operate on you?
  • Open Source Tools
    There are many open source testing tools available.  All testers should know what's out there.
  • Exploratory Testing
    There are now engineering methods growing up around exploratory testing - Session Based Test Management, for example.  It's legitimate and important to realize that you can test in a way where you learn as you go along.  A suggestion that came up: read Whittaker's "How to Break Software" and force every error message.
  • Testing Workshops
    These are participatory.  They help testers break out of the mentality that says, "It's up to you to train me."  Examples mentioned: AWT, LEWT, LAWST, WHET, WOCT, WOPR, WTST.
  • Certification
    Lee likes the idea of certification but isn't happy with any of the existing certification programs for testers that he currently knows about.  Questions to ask about certification:
    • What body of knowledge are they certifying against?
    • What does passing the exam mean?  (For this question, he says that right now what it means is that you've passed the certification test - nothing more.)

It was clear from this keynote and other sessions during the conference that certification is a "hot topic" in the testing community right now.

Lee then talked about his own personal history: putting the 5th node onto the ARPANET (which later became the internet), and being in school with Alan Kay and John Warnock.  He talked about innovation and then invited the audience to suggest what they think future testing innovations will be.  Some responses:

  • Use of collaborative groups ("Group Genius" - Keith Sawyer)
  • Model-Based Testing (Lee almost had this on his earlier list)
  • Testers deciding what is considered right - not just making sure the product matches the spec but also making sure the spec is right.
  • Fluidity (agility)
  • The tension between certification and context - is the software testing profession mature enough for certification?
  • Study of testing history.  For example, the historical paper describing the waterfall development methodology doesn't say what most people think it says.
  • Toolsmithing
  • Ethnomethodology (the study of how people make sense of the world)
  • High-volume test automation
  • Software testing in the academic world
  • Prediction techniques for software problems

I thought this was an excellent keynote for starting off the conference.  It helped people think about the current and future state of testing - in my opinion, a good frame of mind to be in at a conference where I hoped to be exposed to new ideas about software testing from outside the walls of my current job.