
Seattle Area Software Quality Assurance Group

Join the SASQAG group on LinkedIn

Past Meetings 2004

     2015· 2014 · 2013 · 2012 · 2011 · 2010 · 2009 · 2008 · 2007 · 2006 · 2005 · 2004 · 2003 · 2002 · 2001 · 2000 · 1999 · 1998

Return on Software:
Maximizing the Return on Your Software Investment
November 2004

More than 70 percent of software projects are late and over budget, and about 40 percent end up being net money losers: they cost the organization more than they ever return. The reasons behind this poor performance are varied, but they can usually be traced to bad decisions, by the development team, the customer, or both, about WHAT to develop or HOW to develop it. Despite the wealth of information available on HOW to develop software, there is remarkably little about WHY to develop it in the first place, yet knowing WHY illuminates decisions about WHAT and HOW. In addition to discussing WHY to develop software, this presentation describes a systematic process for making better decisions about WHAT to develop and HOW to develop it.
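
For a feel of the kind of decision analysis this topic involves, here is a minimal sketch of comparing two development proposals by net present value. The cash flows, discount rate, and proposal names are invented for illustration and are not taken from the presentation or the book.

# Hypothetical sketch: comparing two development proposals by net present value.
# The cash flows and discount rate below are invented for illustration only.

def npv(rate, cash_flows):
    """Net present value of cash_flows, where cash_flows[0] occurs now (year 0)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

# Proposal A: bigger up-front investment, larger later returns (all figures in $K).
proposal_a = [-400, 120, 180, 220, 220]
# Proposal B: cheaper to build, smaller returns.
proposal_b = [-150, 60, 70, 70, 70]

rate = 0.10  # assumed cost of capital
for name, flows in (("A", proposal_a), ("B", proposal_b)):
    print(f"Proposal {name}: NPV = {npv(rate, flows):.1f}K")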

Acrobat Slides (1MB)

by Steve Tockey
Principal Consultant at Construx Software. 

Mr. Tockey has worked in the software industry since 1977 as a programmer, analyst, designer, researcher, consultant, and adjunct professor. Steve is the designated corporate representative to the Object Management Group (OMG, the source of the UML).

Steve is the author of Return on Software, a book designed to help software professionals maximize the return on their software investment.

A Quick Intro to Model-Based Testing: 45,000 Tests in 45 Minutes or Less
October 2004

Model-based testing overturns many of the accepted ideas about test automation, and it can be both extremely agile and incredibly cheap. Using nothing more than your brain, C# and Notepad, I'll show you how to improve your understanding of your application and create round-the-clock testing that will pound the bugs out of your software.
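
As a rough illustration of the idea (not Mr. Robinson's code, and in Python rather than the C# used in the talk), a model-based test can be as little as a state model of the application plus a random walk that checks the application against the model at every step. The bounded stack, the list standing in for the system under test, and the step count below are all stand-ins.

# A minimal model-based testing sketch (Python stand-in for the C# approach
# described in the talk). We model a bounded stack, drive the "application"
# (here, a plain list acting as the system under test) with random actions,
# and check the application against the model after every step.
import random

CAPACITY = 5

def model_actions(model):
    """Actions the model says are legal in the current state."""
    actions = []
    if len(model) < CAPACITY:
        actions.append("push")
    if model:
        actions.append("pop")
    return actions

def run(steps=45_000, seed=0):
    rng = random.Random(seed)
    model, app = [], []          # model state and system under test
    for step in range(steps):
        action = rng.choice(model_actions(model))
        if action == "push":
            value = rng.randint(0, 99)
            model.append(value)
            app.append(value)    # replace with a call into the real application
        else:
            expected = model.pop()
            actual = app.pop()   # replace with a call into the real application
            assert actual == expected, f"step {step}: expected {expected}, got {actual}"
        assert len(app) == len(model), f"step {step}: state diverged"
    print(f"{steps} generated test steps passed")

if __name__ == "__main__":
    run()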

PowerPoint Slides (699KB)

by Harry Robinson, 
Test Architect for Microsoft's Engineering Excellence Group

Mr. Robinson works with product teams across the company to identify and promote advanced test technologies. He writes a regular column on software testing for StickyMinds.com and hosts the Model-Based Testing Home Page at www.model-based-testing.org.

Rapid Development Methods and Their Impact on Software Quality
September 2004

Agile, Rapid, Scrum, or Dumb are all ways of describing various software development methods making the headlines these days. Are they the greatest thing since the invention of the PC or simply a label to sell books and consulting services?

Do these "NEW" methods have any effect on quality? Does a daily group hug ensure software quality? Are we any better off using the so-called "rapid" techniques? Isn't this a development ploy to scope specifications out of the development process?

We have gathered a panel of experts and practitioners to speak on these issues and answer your questions about these popularized methods. Join us for our monthly meeting and meet our panelists.

Panel Discussion

Beyond the GUI:  What You Need to Know about Database Testing
August 2004

Today's complex software systems access heterogeneous data from a variety of backend databases. This intricate mix of client-server and Web-enabled database applications is extremely difficult to test productively. The data access layer is the point at which your application communicates with the database, and tests at this level are vital to improving not only your overall test strategy but also your product's quality. Mary Sweeney explains what you need to know to test the SQL database engine, stored procedures, and data views. Find out how to design effective automated tests that exercise the complete database layer of your applications. You'll learn about the most common and vexing defects related to SQL databases and the best tools available to support your testing efforts.
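
To make the idea of testing below the GUI concrete, here is a small, hypothetical sketch (not taken from Ms. Sweeney's talk) of a test that exercises the data access layer directly: it runs a query against the database and checks the result set, with no user interface involved. SQLite and the orders table stand in for a real backend.

# Hypothetical sketch of a data-access-layer test: query the database directly
# and verify the result set, with no GUI involved. SQLite stands in for the
# real backend; a production test would connect to the actual database engine.
import sqlite3
import unittest

class OrderQueryTests(unittest.TestCase):
    def setUp(self):
        # In-memory database seeded with known test data.
        self.conn = sqlite3.connect(":memory:")
        self.conn.executescript("""
            CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL);
            INSERT INTO orders VALUES (1, 'Acme', 250.00), (2, 'Acme', 99.50),
                                      (3, 'Globex', 10.00);
        """)

    def tearDown(self):
        self.conn.close()

    def test_totals_by_customer(self):
        # The query under test plays the role of a view or stored procedure.
        rows = self.conn.execute(
            "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
        ).fetchall()
        self.assertEqual(rows, [("Acme", 349.50), ("Globex", 10.00)])

if __name__ == "__main__":
    unittest.main()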

Acrobat Slides (515KB)

by Mary Sweeney
Exceed Training

Ms. Sweeney has been developing, using, and testing relational database systems for 20 years. She's the author of Visual Basic for Testers (Apress, 2001) and several articles on test automation. Mary is a college professor and also performs independent consultation and training through Exceed Training. She has a bachelor's degree in mathematics and computer science from Seattle University.

Using Threat Modeling as a Test Case Design Structure
July 2004

How do you design test cases for software security testing? How do you ensure adequate coverage of security vulnerabilities?

Threat Modeling can be used as a test case structure because it helps identify assets (things to protect), entry/exit points (ways to get at an asset), threats (risks to an asset), vulnerabilities (specific ways to execute the threat), and mitigations (ways to close the vulnerability). Risk analysis (impact vs. probability) helps measure and prioritize vulnerabilities, as well as evaluate the deployment and maintenance cost of a given mitigation.

This presentation gives an overview of Threat Modeling and discusses the new Threat Modeling book (MSPress) and the related tool for performing threat modeling.
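
As a hypothetical sketch of how the elements described above might drive test-case prioritization (this is not the MSPress tool or Mr. Willits's material), the fragment below records vulnerabilities against a threat model and ranks them by a simple impact-times-probability risk score; all names and numbers are invented.

# Hypothetical sketch: recording threat-model elements and ranking
# vulnerabilities by a simple risk score (impact x probability) so the
# riskiest items get test cases first. Not the MSPress tool, just an
# illustration of the structure described above.
from dataclasses import dataclass

@dataclass
class Vulnerability:
    asset: str          # thing to protect
    entry_point: str    # way to get at the asset
    threat: str         # risk to the asset
    description: str    # specific way to execute the threat
    impact: int         # 1 (low) .. 10 (high)
    probability: float  # 0.0 .. 1.0
    mitigation: str = ""

    @property
    def risk(self) -> float:
        return self.impact * self.probability

vulns = [
    Vulnerability("customer records", "login form", "SQL injection",
                  "unsanitized username field reaches the query", 9, 0.6,
                  "parameterized queries"),
    Vulnerability("session tokens", "cookie", "session hijacking",
                  "cookie sent without the Secure flag", 7, 0.3,
                  "set Secure/HttpOnly flags"),
]

# Highest-risk vulnerabilities become the first test cases to write.
for v in sorted(vulns, key=lambda v: v.risk, reverse=True):
    print(f"risk={v.risk:4.1f}  {v.threat} via {v.entry_point} -> {v.asset}")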

PowerPoint Slides (1.5MB)

by Don Willits 
Developer Trainer, Microsoft Corp.

Mr. Willits has been working with developers, testers and program management to write secure code and to effectively threat model for several years now. He is currently a Developer Trainer at Microsoft, teaching courses on security, engineering excellence and .NET technologies. In his spare time he is preparing for the Seattle-To-Portland bike ride, hikes, surfs the web with his wife of 16 years and creates 3D animations of spaceships. 

Exploratory Testing and Session-Based Test Management
June 2004

Like the music in a jam session, exploratory testing is unscripted and spontaneous. Its agile nature makes it a widely used and effective test method, but it is often dismissed by project managers in less agile environments because it lacks mechanisms to measure progress, cannot withstand scrutiny, and does not meet project requirements for traceability. Jon Bach discusses how managers can solve these problems using a simple, effective test measurement technique.
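
One way to picture the kind of measurement involved is a session record: each exploratory session has a charter and a time box, and the tester reports how the time split between testing, bug investigation, and setup, giving managers something countable to roll up. The structure below is an illustrative sketch, not the actual Session-Based Test Management tooling, and the sample sessions are invented.

# Illustrative sketch (not the actual SBTM tooling): a session record that a
# manager can count and roll up: charter, time box, and how the time split
# between test design/execution, bug investigation, and setup.
from dataclasses import dataclass

@dataclass
class Session:
    charter: str            # mission for the session
    minutes: int            # length of the time box actually used
    pct_test: int           # % of time on test design and execution
    pct_bug: int            # % of time investigating and reporting bugs
    pct_setup: int          # % of time on setup and obstacles
    bugs_found: int

sessions = [
    Session("Explore import of malformed CSV files", 90, 60, 25, 15, 4),
    Session("Explore concurrent edits to the same record", 60, 70, 10, 20, 1),
]

total = sum(s.minutes for s in sessions)
test_minutes = sum(s.minutes * s.pct_test / 100 for s in sessions)
print(f"{len(sessions)} sessions, {total} minutes, "
      f"{test_minutes:.0f} minutes of on-charter testing, "
      f"{sum(s.bugs_found for s in sessions)} bugs")
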
Acrobat Slides (318KB)

by Jon Bach
Managing Test Lead, Quardev Laboratories

In his 9 years of QC experience, Mr. Bach has worked on projects for Microsoft, Rational, HP, Getty Images, Captaris, and Washington Mutual. A former test manager at Microsoft, he is now Managing Test Lead at Quardev Laboratories in Seattle, taking time to speak and write for trade magazines and conferences about issues in QC. With his famous brother James, he invented Session-Based Test Management.

Objective Oriented Projects
May 2004

The focus will be on QA's responsibility to ensure that connections between project activities and the meta-project (project plan, etc.) are timely, and to provide the information needed for successful project management. The suggested approach is to operate the project from an objective (requirements) view.
Acrobat Slides (841KB)

Greg Patrick, 
Principal of TechMeth, Inc.

Mr. Patrick founded TechMeth, Inc., a local consulting firm that specializes in Quality Assurance and Testing services for clients throughout the Northwest.  He has personally worked with many clients as Project Manager, QA Manager, Testing Manager and Senior Consultant on major software development and methodology implementation engagements.

Maturing the Testing Process Where You Work:
Individuals Can Make a Difference

April 2004

How to increase your value as an employee at work
How to personally use what you learn at a conference in your daily work
How to identify the "ideal" testing processes that will work for your company
How to help in the evolution of the company testing processes towards that "ideal" process
How to help managers, developers and other testers at work use what you learn at conferences, seminars or in books.

Acrobat Slides (876KB)

Cordell Vail, CSTE

Mr. Vail works as an automated testing specialist at Washington School Information Processing Cooperative (WSIPC) in Everett. He has also spent 8 years as a contractor doing automated software testing (using IBM Rational Robot and Mercury Interactive WinRunner) at Weyerhaeuser. Additionally, Cordell has been the controller of a corporation, a software developer, and a systems analyst working directly with customers, and he has been a software tester since 1996.

Predicting Software Defects
March 2004

We know how to look for defects.  We know where to look for defects.  We know how to fix defects.

Do we really know when to look for defects and when to stop looking (other than by looking at the WBS, schedule and budget)? 

This presentation is a survey and discussion of various methods of predicting software defects and how such methods can improve the efficiency and effectiveness of your Quality Assurance and Quality Control.
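
As one example of the kind of method such a survey might touch on (not necessarily one covered in the slides), capture-recapture estimation uses the overlap between two independent reviewers' findings to estimate how many defects remain to be found. The inspection counts below are invented.

# One example of a defect prediction method (not necessarily one covered in
# the slides): capture-recapture estimation. If two reviewers independently
# inspect the same work product, the overlap in their findings gives a rough
# estimate of the total defect population, and hence of what remains.

def capture_recapture(found_by_a: int, found_by_b: int, found_by_both: int) -> float:
    """Lincoln-Petersen estimate of the total number of defects present."""
    if found_by_both == 0:
        raise ValueError("no overlap, so the estimate is unbounded")
    return found_by_a * found_by_b / found_by_both

# Hypothetical inspection: reviewer A logs 24 defects, reviewer B logs 18,
# and 9 of those are the same defect.
total_estimate = capture_recapture(24, 18, 9)
found_so_far = 24 + 18 - 9
print(f"estimated total defects: {total_estimate:.0f}, "
      f"found so far: {found_so_far}, "
      f"estimated remaining: {total_estimate - found_so_far:.0f}")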

It would be most helpful if those who have defect prediction processes and practices in their organizations, or in their own experience, come ready to describe and evaluate those processes.

PowerPoint Slides (187KB)

Steve Neuendorf

Mr. Neuendorf has spent over twenty years in software engineering metrics and process improvement, and the 15 years before that in various consulting, teaching, industrial engineering, and cost and management accounting positions. He is experienced with Function Point Analysis (FPA). He has designed and implemented processes that use FPA for management and improvement of activities and processes. Steve is well versed in the Software Engineering Institute's Capability Maturity Model. He is familiar with ISO standards and their use and has worked extensively with ASME commercial and nuclear quality standards. 

Mr. Neuendorf is the author of two books: Project Measurement (Management Concepts, Vienna VA, 2002) and Six Sigma for Project Managers (Management Concepts, Vienna VA, currently in publication).

Business Metrics for Software QA
February 2004

A significant portion of software project resources is expended revising and modifying the software after it has been initially built; in other words, rework. If our software development processes and tools are poor, we incur more rework time to get the software to the necessary level of quality. Rework is therefore a valid business measure of how good we are at developing software. In this presentation, we will examine the relationship between rework and software QA and testing, and show why software QA and test processes are probably the most significant drivers of rework costs. "Good enough" metrics for assessing the impact of testing on rework will be demonstrated. We will also show how an emphasis on reducing rework costs has a major impact on investment in testing tools and solutions. By focusing on rework, QA managers can make hard-dollar business justifications for new testing tools and training.
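
A back-of-the-envelope version of the argument: if you know roughly what fraction of project effort goes to rework, you can put a dollar figure on even a modest reduction, which is the kind of hard-dollar justification described above. Every number below is an invented assumption, not data from the presentation.

# Back-of-the-envelope rework math with invented numbers, to illustrate the
# kind of hard-dollar justification described above.
annual_dev_spend = 5_000_000      # assumed total annual software development spend ($)
rework_fraction = 0.40            # assumed share of effort spent on rework
rework_reduction = 0.10           # assumed relative reduction from better QA/testing
tooling_and_training = 80_000     # assumed cost of the tools and training ($)

current_rework = annual_dev_spend * rework_fraction
annual_savings = current_rework * rework_reduction
print(f"current rework cost:  ${current_rework:,.0f}/year")
print(f"projected savings:    ${annual_savings:,.0f}/year")
print(f"first-year net:       ${annual_savings - tooling_and_training:,.0f}")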

C. Peter Becker
President, Software Prototype Technologies

Mr. Becker is the founder and president of Software Prototype Technologies. He has over forty years of experience in software development and has spent the last twenty-five years associated with software quality assurance and testing. Mr. Becker has played a central role in the development of tools and technologies based on model-based testing. Today SPT is the only company to offer an integrated model-based testing solution that automates the design, documentation, and execution of software test cases. Mr. Becker has authored numerous white papers and other articles on software testing and speaks frequently on the topic.

"An Approach to Managing Change or Launching a Process"
January 2004

What model do you follow when rolling out a new or improved process? How do you manage your community through the installation and adoption of a change? Whether it is a standardized process, a new piece of technology, or a better approach, it helps to have a roadmap to guide you through those bumps along the way. David Capocci will share with us one simple and common model that can be applied by anyone who finds themselves in the role of a change agent.
PowerPoint Slides  (132KB)
Change Management Strategy (68KB)

David Capocci, CSQA, CSTE, 
Senior IT Specialist
Safeco

Mr. Capocci serves on the Advisory Board for the University of Washington Software Testing Certificate program and has taught classes for the program. He coordinates Seattle exams offered by QAI through SASQAG. He has presented at QAI's International Software Testing and Extreme Testing conferences, and at the STAR conference.

Email questions about SASQAG or this web site to: webmaster at sasqag.org

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058