Review by Phlip: Xtreme Testing Is Elementary, My Dear Watson
Yahoo groups -agile-testing Agile Software Testing 
From: "Phlip" 
Date: Sun Feb 16, 2003  7:33 pm
Subject: Re: Automated Testing - 2 white papers phlip_cpp
[Netiquette tip: Don't reply to a private mail on a
public forum. Rule suspended here: it's all technical stuff.]

Alex Samurin wrote:

> please comment the following:
> It is a view of the black box tester.

I think we're discussing this site:

http://www.geocities.com/xtremetesting/

I also suspect that site picked the name "extreme" not
realizing it was taken...

> My opinion on this topic is a little different from
> the common one.
> 1. Define what is considered to be Quality for your
> system and then begin
> to build Xtreme testing process to assure you meet
> the standard you set.

That sounds like a phase. When confronted with a phase
in time, turn it on its side. Do the phase in tiny
increments spread out over time.

Put another way, the longer your project goes, the
better you will understand what your project's
specific "Quality" needs are. So the "planning" and
"implementing" are chronic, not acute.

> My suggestion is that quality for Xtreme testing
> project should be defined
> as achieving the target, increasing the testability
> and minimizing the
> variability of the system under the test. The fact
> that minimizing the
> variability of the system under the test increases
> its quality has been
> proved long ago.

That's where XP differs. We encourage "Refactor
Mercilessly". That means we try to vary the system
over time. If we can't (likely), then it got good. If
it can, the design's getting better.

So the thing to minimize is the source code's
resistance to change. The more we change it, the more
flexible everything becomes.

> Designing unit tests first forces
> you to build modular,
> unit testable objects (see the vision page) and is
> only the first step
> for minimizing the system variability and increasing
> testability. Read
> Taguchi if you would like to find more about
> minimizing the variability
> of the system under the test.

"Test first" does not mean "big test up front". It
means you write test and code at the same time, with
the tests very slightly in the lead.

In terms of "design for testing" - the ideal I think
you approach there - we kill to birds with one stone
(design and debugging) using "design BY testing".

> It is possible to
> minimize variability of
> the system during requirements defining;
> architectural design, GUI design
> and coding .

Those are the changes we embrace. The point is to let
the OnsiteCustomer steer. Requirements get defined in
realtime, not up-front.

(Are you starting to notice any themes yet?)

> Minimizing variability during test
> cases design helps to save
> time in a future maintenance of the test suite.
> Remember that:
> * Xtreme testing is a major source of quality
> feedback

All feedback. It's the innermost loop that pumps these
outer loops:

http://www.extremeprogramming.org/map/loops.html

> * Xtreme testing is a key to customer satisfaction.

Have you cracked an XP book yet? The satisfaction is
in their own hands. We just give them a steering
wheel. If they drive off a cliff, we still get paid
(until the end ;-).

> 2. XP suggests working in pairs. Are two brains
> better than one? Sure.
> However, is productivity increased by two? Instead
> of working in pairs
> as XP suggests for developers, use rotation team
> (even for this activity
> you need to schedule additional time for switching)
> where a less experienced
> tester starts the cycle. Rotation can be based for
> example on the iteration
> cycles.

The XP model is spontaneous here. We pair
promiscuously, and we put all engineers (AND testers
AND the customer) in one room. This makes
communication almost free, unlike paperwork.

> 3. Xtreme testing embodies communication with focus
> on simplicity: use
> person-to-person communication instead of written
> documents wherever
> possible.
> Cooperation between developers and testers through
> all development life
> cycle is essential.
> One of the rules is: Add document, or process only
> when you really need
> them,
> not in anticipation of need.

Right.

> If you still have to create a document - it should
> contain essentials only.

You grow the document incrementally, getting it
reviewed by the official Person Who Gives a Crap About
It. When it passes its test, you stop.

> * Are you afraid that a tester will leave in the
> middle of the project?
> Schedule an additional time for him to prepare an
> extensive documentation.

Refactoring adds self-documentation, to both the tests
and the code.

> 4. In fact extreme programming/testing provides
> better risk management than
> many other methods.

What's XP's strategy for dealing with risk?

What's a fish's strategy for dealing with water?

> Let us begin Xtreme testing (set of guidelines):
> "The hole is more than the sum of its parts."
> Aristotle, in Metaphysics (384 - 322 B.C.)

Priceless typo there ;-)

> * Define the target.
> * Define the minimum necessary documentation for
> testing that must
> be created.
> * Design test first. ( Try to specify as many test
> cases as possible
> before the code is written.)
> * Let developers write unit tests according to XP.
> Define 100% line
> coverage of code for unit testing as a must. This
> means only that
> developers'
> work is finished, not that code is bugs free.
> * Try to eliminate redundancy of TC for unit and
> system testing.

Not "try". do. Unit Tests on checked-in code run 100%,
all the time.
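For the literal-minded: "run 100%, all the time" means the whole suite goes green before anything is checked in. A toy sketch of that gate in Python (the test case here is invented for illustration, not anyone's real suite):

```python
import unittest

class TestArithmetic(unittest.TestCase):
    def test_add(self):
        self.assertEqual(1 + 1, 2)

# Load and run every test; check in only if ALL of them pass.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestArithmetic)
result = unittest.TextTestRunner(verbosity=0).run(suite)
ok_to_check_in = result.wasSuccessful()  # one red bar blocks the check-in
```

No "mostly passing", no skipped-but-we'll-get-to-it. Green or you don't integrate.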

> * Try to avoid frequent builds for bugs fixes and
> changes after promotion.

You might mean "don't fork the code. If you do, don't
change the release fork to much".

We don't fork the code. Problem solved.

> A nightly build is good practice only during the
> development process.

Most programs can build in less than an hour. Do a
total build several times a day.

> Try to avoid (as much as you can) supporting
> different versions of
> applications
> in production..

You mean "don't fork the code". If the Customer asks
for different versions (also called "products" or
"flavors") they get them. One code base behind them.

> * Try to cover with test cases all layers of the
> system. (define TC related
> to
> the system architecture and environment)

There's that "try" again ;-)

> * Try to make all testing processes more robust.
> * Allow working together developers and testers
> (deep integration.)

This is now CMM territory. Look at my recent post to
news:comp.software-eng about this.

> I mean here not pairs of developers or testers but
> developers and testers
> working in one group.
>
> And try to make some innovations like:
> * Using a rotation team.

Rotate spontaneously. Don't bother to control this,
one way or another.

In XP, developers don't own code modules, they own
tasks. The most qualified developer bids on which
tasks they'll take.

> * Implementing a gray box testing.

You might try
http://groups.yahoo.com/group/agile-testing for that
one.

> * Participating of testers on all architecture
> related meetings and
> walkthrough.

"Meeting"? What's a "meeting"??

> * Developers reviewing TC running by testers and
> vice versa.

Uh, they pair.

> * Testers involving in the business analysis and
> requirements review. This
> can
> help to increase testability and minimize the
> variability of the system
> under test.

XP calls the "business analyst" the "CustomerTeam".
The distinction is we don't give a rat's ass what the
title of any "customer" is, only that they are the
customer and we obey them.

> * Use UML model and diagrams for defining TC: This
> is not the spirit of XP,
> but
> from my experience it helps to create better TC.

Research CRC Cards.

Like all BigDesignUpFront, they let you practice and
learn about the space before diving into the code.

But don't waste time on it, don't worship the result,
and don't bother "fixing" the UML if the source should
happen to depart a little. Or a lot.

> Unfortunately, special
> diagrams and notations for testers do not exist yet.

That's a very interesting assertion. You might want to
help fill this niche.

> I believe that easily
> understandable graphical TC is the only right way
> for improvement
> the testing process.
> Class diagram for unit test cases
> Sequence diagram for integration test cases
> Use Case and Activity diagram for system test cases

Yawn. My left brain just shut down (or is it my
right?).

> * Use orthogonal array technique to reduce the
> number of TC.

Oookay. There is a body of test jargon out there that
I am only starting to learn. Let's call them...

- mock
- fixture
- resource
- others..?

Test code follows style guidelines different from
production code, so they have different design
patterns, too.
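For anyone who hasn't met those terms in code yet, here's a toy Python sketch using the standard library's unittest.mock (the `notify` function and its mail-server collaborator are inventions of mine): the *fixture* is the shared setUp, and the *mock* stands in for a slow or unavailable external resource:

```python
import unittest
from unittest.mock import Mock

def notify(user, mailer):
    # Production code under test: sends one mail per notification.
    mailer.send(user, "You have mail")

class TestNotify(unittest.TestCase):
    def setUp(self):
        # Fixture: common setup, rebuilt fresh for every test method.
        self.mailer = Mock()  # Mock: stand-in for the real mail server

    def test_sends_exactly_one_mail(self):
        notify("alice", self.mailer)
        # The mock recorded the call, so we can assert on it.
        self.mailer.send.assert_called_once_with("alice", "You have mail")

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNotify)
unittest.TextTestRunner(verbosity=0).run(suite)
```

That's why test code gets its own patterns: a mock in production code would be a bug, but in a test it's the whole point.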

> * Next step will be creating a built-in diagnostic
> for any of:
> External system
> Third party libraries
> Database
> Configuration parameters
> Built-in verification and calculation of a control
> example for the
> end-to-end scenario - is a good decision for some
> systems.

I whack all test code when I compile for RELEASE mode.

You might be implying the topic "release mode testing"
here.
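"Whacking test code in RELEASE mode" is C/C++-style conditional compilation. If you wanted the same trick in Python (a rough analog I'm supplying, not Phlip's actual setup): `assert` statements and `if __debug__:` blocks are compiled out entirely when you run with `python -O`:

```python
# Under `python -O`, the assert and the __debug__ block below
# disappear from the compiled bytecode -- the closest Python analog
# to stripping diagnostics out of a RELEASE build.

def transfer(balance, amount):
    assert amount >= 0, "diagnostic check: stripped under -O"
    if __debug__:
        # Built-in diagnostic: present in debug runs only.
        print("transfer called with amount =", amount)
    return balance - amount
```

The production logic is identical either way; only the self-checks come and go.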

>
> Allow developers to concentrate on:
> * Developing new and running existing unit test
> cases.

Yes. XP partitions labors admirably here.

> * Applying analysis tools such as code coverage
> directly into your JUnit
> tests

No! That's QA's job. Engineers test, code, and
refactor. Period.

http://www.greencheese.org/DontPlanDesigns

> * Generating automatically unit tests from class
> definitions and UML models.

Nope. If we can generate them automatically, then the
source we generate from is >source<.

We test code and refactor >source<.

If this generator can't handle us refactoring it,
round trip, we may run it once to get started, but
then we boot it out of the loop. Like the VC++ Class
Wizard.

> Allow tester (i.e. black box tester) to concentrate
> on:
> * Integration testing of the system.
> * Business cycle testing (system behavior in certain
> time frames).
> * Concurrency testing.
> * Integration testing of any third party components.
> * Specification based testing, scenario testing.
> * Database testing using CRUD matrix. (gray box
> testing method)
> * Other testing types like: load, performance,
> installation, volume, user
> documentation, disaster recovery etc.

Absolutely. They go straight to the hardest part of
their job, and get good at it.

In conclusion, your heart's in the right place, but
you gotta crack a book. Start with any XP tutorial,
and with /Testing Extreme Programming/.

--
Phlip
http://www.c2.com/cgi/wiki?SheChangeDesignInTheDatabase

© 2003 Alex Samurin http://www.geocities.com/xtremetesting/