
Conference vmsdev::alpha_verification

Title:Alpha Verification conference
Notice:QTV Bug Chasers! Practicing the fine art of defect detection, reporting and reproduction...
Moderator:STAR::JFRAZIER
Created:Wed Mar 20 1991
Last Modified:Tue Jun 03 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:487
Total number of notes:2459

484.0. "QTV Gryphon (OpenVMS V7.1) Retrospective (any follow-ons...)" by STAR::FREYBERGER () Wed Mar 12 1997 13:36

From:	EVMS::STAR::AWARNER "Anne Warner ~ (603) 881-6376 ~ dtn: 381-6376 ~ ZK3-4/S23  21-Feb-1997 1423 -0500"
21-FEB-1997 14:25:04.82
To:	@QTV_RETRO
CC:	
Subj:	I: QTV Gryphon Retrospective - Final Report

			QTV Gryphon Retrospective
				Final Report


Retrospective Facilitator/Report Author: Anne Warner

I. Overview

On Tuesday, February 11, 1997, members of the QTV Gryphon team met to do
a retrospective of their experiences with the Gryphon Release.  Participants 
brainstormed around Gryphon successes, things they liked, and learnings -
and around things in Gryphon that they didn't like, things that didn't work,
and things they would like to change.  

After all ideas were on the table, or in this case the board, people picked 
out the key ideas that they believed had the biggest impact on the release and 
that they would like to either change or ensure are kept.  These key ideas 
were discussed for more in-depth understanding and either assigned owners and 
actions or classified as "systemic".  Systemic issues are those that cannot be 
directly impacted by QTV.  They have been grouped with systemic issues from 
other retrospectives and will be analyzed and presented to Linda Benson's 
staff.  The QTV actionable ideas are listed in this document.  These actions 
will be tracked by the QTV staff until they are closed.

In summary, the positives from Gryphon showed that the QTV processes worked
fairly well.  Regression and Matrix testing got good marks.  Status
meetings and reports were seen as having value.  Positive relationships with
other key groups were also discussed.  QTV had a good relationship with the
Gryphon Release Team and felt listened to.  Once development groups saw
the contribution that QTV provided, many QTV representatives acted as 
valued members of the development projects.  This was seen as a win-win 
situation for the development group and QTV.

Key things for improvement include further enhancement of test and tool 
maintenance.  Specifically in this area, Layered Product testing needs to
be defined and planned.  There were questions about how QTV can promote the
value it adds beyond testing.  It was determined that QTV hardware needs to 
be better tracked so that people know where it is and who has it.  There 
were also many systemic issues raised around how QTV's work is impacted by 
changes in project and release schedules.

The next section contains the Key Gryphon Issues with associated actions or 
learnings.  Section III contains the Retrospective Details.

II. Key Gryphon Issues and Actions

QTV produces a lot of statistics around crash rates that it doesn't utilize.

QTV1:	Action: Create a note requesting data on crashes in the Alpha 
	Verification notes conference.  Notify QTV about the note's existence
	DRI: Tim Beaudin
	DUE: 18-Feb
	Status: complete 19-Feb

Status reports are useful but not written consistently and may be cumbersome
for both the writer and receiver

QTV2:	Action: Review status report template as part of QTV Communication 
	Team work
	DRI: Chris Shaver
	Status: Plan by 19-Feb

There is still room for improvement in test/tool maintenance. (VAX 5 and 
Alpha 11 unanalyzed, unanswered QARs)

QTV3:	Action: Poll QTV to understand issues in more detail.  Determine
	if any future action is necessary and/or feasible
	DRI: Mick Konrad
	DUE: 21-Feb
	Status: mail sent to poll group 19-Feb, request for update sent 2/21

How do QTV goals gain organization-wide respect?  QTV could help promote itself
by providing value added data to projects.

QTV4:	Action: Determine how much work it would be to collect data
	demonstrating QTV's added value
	DRI: Ed Maher
	Due: 7-Mar
	Status: Date revised from 21-Feb to 7-Mar on 20-Feb

There are QTV hardware issues, specifically where it is and who has it.
Also there are no dedicated matrix testing machines.

QTV5:	Action: This is already being worked.  Action is to provide an update
	to QTV
	DRI: Rich Jancsy
	Status: Update provided by 21-Feb

It would be very helpful to all of QTV to know what is happening with the 
release schedule at all times.

QTV6:	Action: Talk with Brian Allison about the Raven specific communication
	channels
	DRI: Mick Konrad
	Status: update by 21-Feb

Layered Product Testing needs to be better defined and planned

QTV7:	Action: Revise or define a repeatable process for LP testing 
	DRI: Mick Konrad
	Status: Plan defined by 21-Feb

QTV8:	Action: Determine if Layered Product Testing can be automated
	DRI: Joe Mahan
	Status: recommendation due by 25-Feb

Statement of Quality timing needs to be changed.  

QTV9:	Action: This issue is already being addressed
	DRI: Ed Maher
	DUE: Change proposal due by 3-Mar
	Status: on Track

Validation plans seemed too detailed for the value they provided (e.g., purpose,
reliability, and quality factors sections).  

QTV10:	Action: Determine if QTV project specific Validation Plan template needs
	updating
	DRI: Curt Spacht
	DUE: 11-Mar (Answer and plan, if appropriate)
	Status: in progress (mail sent 17-feb)

Transition criteria were given more visibility in FT2 -> SSB.  They were given
Base OS sponsorship and had someone keeping them visible

QTV11:	Action: Reference the following action from the Release Team retrospective
	RM3: Early doneness criteria definition allowed clear consideration of 
	what they should be (as opposed to defining them as they were used).  This 
	will be defined for inclusion in the Raven project plan.
	DRI: Ed Maher (working with Brian Allison) 
	Due: April, 1997

Systemic Issues: It was determined that no QTV action could impact the following
issues; however, they had a significant impact on QTV's productivity.

QTV12:	Action: Present Systemic issues from QTV, Release Team, and 
	Documentation retrospectives to Linda Benson's staff
	DRI: Anne Warner
	Status: by 31-Mar

	- Not enough time between pressing of CDs and release for sufficient
	matrix testing

	- Code freeze?  What code freeze?
	
	- Code freeze transition criteria ignored (especially FT1 and before)

	- Information on how close projects were to code freeze was not
	accurate

	- Project unpredictability causes condensing of test in QTV.  Impacts
	completeness of test

	- Not all P0s and P1s fit the P0/P1 definition

	- Some projects' expectations about testing are mismatched with
	QTV's expectations about what it will test

	- Communication between release team and QTV regarding schedules
	could be better early in the release cycle.

	- It is a win-win outcome when development groups work in partnership 
	with QTV.  How can this be promoted with all development groups?
	How do QTV goals gain organization-wide respect?

Significant learnings:

QTV Project Leader Release Centric Status meetings were useful

	Learning: Have and publish agendas for meetings.  Keep them on track
	and as brief as possible, and hold them only as necessary (not just
	because one is scheduled)

Regression testing worked well.  

	Learning: Have same person do regression testing throughout the
	entire release and have a defined process

Matrix testing: 
	- Having planned time for matrix testing worked (although still
	intrusive in the Qual Cycle)
	- Having matrix testing more closely integrated with Qual (FT2->)
	worked 

	Learning: QTV Release and Matrix Test managers need to work together
	early and often



III. Retrospective Details

This section contains all of the ideas that were presented at the retrospective.
It includes the ideas that were considered key and highlighted in the previous
sections, as well as other ideas and issues that were not considered for further
action.

III.A  Things that need improving
---------------------------------

Statement of Quality timing needs to be changed.  Ed Maher is already working
on this.
	DRI: Ed Maher
	Status: Change proposal due by 3-Mar

There is still room for improvement in test/tool maintenance. (VAX 5 and 
Alpha 11 unanalyzed, unanswered QARs)
	Action: Poll QTV to understand issues in more detail.  Determine
	if any future action is necessary and/or feasible
	DRI: Mick Konrad
	Status: update by 21-Feb

DECnet Plus not tested on VMS until QTV got it.  This is one of a number of
DECnet Plus issues that were raised.  Most were seen as not likely to repeat
in future releases.  This one, however, was indicative of a larger problem
of mismatched expectations between projects and QTV.  It has been classified
as SYSTEMIC and no QTV specific action was proposed.

Layered Product Testing needs to be defined and planned
	- There is no cookbook for LP testing (this is especially difficult
	when several people are working on it).
	- LP testing was not planned for until September 1996.  When it happened
	it immediately became a priority.

	Action: Revise or define a repeatable process for LP testing 
	DRI: Mick Konrad
	Status: Plan defined by 21-Feb

	Action: Determine if Layered Product Testing can be automated
	DRI: Joe Mahan
	Status: Plan due by 25-Feb

QTV produces a lot of statistics around crash rates that it doesn't utilize.
	Action: Create a note requesting data on crashes in the Alpha 
	Verification notes conference.  Notify QTV about the note's existence
	DRI: Tim Beaudin
	Status: complete by 18-Feb

There are QTV hardware issues, specifically where it is and who has it.
Also there are no dedicated matrix testing machines.
	Action: This is already being worked.  Action is to provide an update
	to QTV
	DRI: Rich Jancsy
	Status: Update provided by 21-Feb

Validation plans seemed too detailed for the value they provided (e.g., purpose,
reliability, and quality factors sections).  

	Action: Determine if QTV project specific Validation Plan template needs
	updating
	DRI: Curt Spacht
	Status: Answer and plan, if appropriate, by 11-Mar

Systemic Issues: It was determined that no QTV action could impact the following
issues; however, they had a significant impact on QTV's productivity.

	Action: Present Systemic issues from QTV, Release Team, and 
	Documentation retrospectives to Linda Benson's staff
	DRI: Anne Warner
	Status: by 31-Mar

	- Not enough time between pressing of CDs and release for sufficient
	matrix testing

	- Code freeze?  What code freeze?
	
	- Code freeze transition criteria ignored (especially FT1 and before)

	- Information on how close projects were to code freeze was not
	accurate

	- Project unpredictability causes condensing of test in QTV.  Impacts
	completeness of test

	- Not all P0s and P1s fit the P0/P1 definition

People are unaware of what is happening, particularly in the early stages of
the release, with regard to schedule.  This is a communication issue with the
release manager/team.  It would be very helpful to all of QTV to know what
is happening with the release schedule at all times.

	Action: Anne to include this with the systemic issues presented to
	Linda Benson's staff

	Action: Talk with Brian Allison about the Raven specific communication
	channels
	DRI: Mick Konrad
	Status: update by 21-Feb


Other issues that were discussed but not called out for further action.
Note that some are similar to issues with actions.

- There was confusion around QTV priorities between Eagle/Theta, Hardware
releases and Gryphon when Gryphon was first started.

- The system validation cluster was unstable until SSB

- Gryphon development projects were never solidified

- Gryphon planning was tough - it was unclear whether projects were in
E/T or Gryphon for a while.

- There was a perception that lots of Gryphon bugs went out because of 
DECnet Plus problems

- DECnet Plus (Phase V) training was not helpful.  It was not Just-in-time,
was at an overview level, did not have follow-up, and a demo would have
been useful.

- QTV working relationship with DECnet Plus group wasn't good

- DECnet Plus group expected QTV to test - QTV did not expect to test DECnet
Plus

- DECnet changes did not stop at code freeze.  Changes kept coming; not just
fixes but new functionality

- More work was taken on with Layered Product testing and nothing was
taken away.  There were also no requirements on what to test.

- Day-to-day release slips (instead of slipping to a new scheduled date) made 
it hard for QTV to plan 

- FT1 was painful

- Projects that didn't do a good job on initial scheduling were not replanning

- Requirements around error introduction rate changed in the heat of battle

- No P0's hit code freeze in Gryphon

- Everyone is fighting fires which impacts code freeze

- Being responsible for the test of multiple projects when one is in Qual
is difficult to manage

- The person logging the QAR should close the QAR once they target test it

- The system validation cluster was overloaded


III.B  Things that went well
----------------------------

Status reports:
	- Status reports keep people honest (for themselves) and allow them to 
	keep track of what they do for reviews
	- A project report and a status report are one report too many
	- Weekly status reports were inconsistent from one project leader
	to another

	Action: Review status report template as part of QTV Communication 
	Team work
	DRI: Chris Shaver
	Status: Plan by 19-Feb

QTV Project Leader Release Centric Status meetings:
	- Weekly status meetings were helpful - shared info
	- Project meetings went well, taken as 'part of team'

	Learning: Have and publish agendas for meetings.  Keep them on track
	and as brief as possible, and hold them only as necessary (not just
	because one is scheduled)

Transition criteria were given more visibility in FT2 -> SSB.  They were given
Base OS sponsorship and had someone keeping them visible

	Action: Reference the following action from the Release Team retrospective
	RM3: Early doneness criteria definition allowed clear consideration of 
	what they should be (as opposed to defining them as they were used).  This 
	will be defined for inclusion in the Raven project plan.
	DRI: Ed Maher (working with Brian Allison) 
	Due: April, 1997

Regression testing worked well.  

	Learning: Have same person do regression testing throughout the
	entire release and have a defined process

Working with development groups - How do QTV goals gain organization-wide 
respect?  Part of this is a systemic issue.  QTV could help promote itself
by providing value-added data to projects.
	- It is helpful when development groups work in partnership with QTV
	- Working with development for duration of release (planning, code
	reviews, testing, etc.) works
	- Development groups start to see QTV value when they see the defects
	- Validation plans were not reviewed in detail by developers

	Action: Determine how much work it would be to collect data
	demonstrating QTV's added value
	DRI: Ed Maher
	Status: Due 21-Feb

Matrix testing: 
	- Having planned time for matrix testing worked (although still
	intrusive in the Qual Cycle)
	- Having matrix testing more closely integrated with Qual (FT2->)
	worked 

	Learning: QTV Release and Matrix Test managers need to work together
	early and often

Release managers listened to QTV (no specific action defined)

Having a project leader with a good technical background and ability to
answer questions helped (no specific action defined)

It shipped!


Other issues that were discussed but not called out for further action.
Note that some are similar to issues with actions.

- Shadowing 96 and Mount 96 projects

- QTV designated 1 person to deal with DECnet Plus problems so it wasn't
spread out over everyone

- Ability to learn new areas

- "Unloading" work from system validation cluster to allow focus

- Early testing of layered products worked in some cases
	