
Seattle Area Software Quality Assurance Group

Join the SASQAG group on LinkedIn

Past Meetings 2007


Six-Sigma in Software

November 2007

Anyone working in our field has probably at least heard of Six Sigma. We may also have heard of Lean, or even Lean Six Sigma. Common wisdom says it is good, or even great, for manufacturing or highly repetitive processes.

Uncommon wisdom, some of it noted in research the speaker has recently been involved in, is that there is so much repetitive process within what we consider "one-off" or "one-time" events that it may be worth distinguishing what is process from what is problem, and trying Six Sigma on those processes too.

This participative session will cover some of those research results and engage the audience in relating their perceptions of and experiences with Six Sigma, and with process and quality improvement in general. It is fitting to discuss this around the anniversary of Y2K, since Y2K was clearly one of the significant events that pushed organizations using Six Sigma in "unconventional areas" toward embracing what would previously have been viewed as radical, high-risk change.

PowerPoint Slides (1.7mb)

Steve Neuendorf

Steve has over thirty years' experience in consulting, management, industrial engineering, and measurement, with twenty-five years directly related to the management, measurement, and improvement of software engineering projects and processes. Steve also has extensive management consulting experience; BA and MBA degrees, with postgraduate work in information management; and a JD and subsequent practice focusing on business. Steve has extensive teaching experience, ranging from academics to hands-on workshops.

For the past year, Steve has been conducting a research project for NASA on the management of strategy, portfolios, programs, and projects in corporate America. The study analyzed program and management structure, sizing, and performance, and the influence of the recent and dramatic changes it uncovered.

Steve is the author of two books:

Six Sigma for Project Managers, Management Concepts, Inc., Vienna, VA, 2004, ISBN 1-56726-146-9

Project Measurement, Management Concepts, Inc., Vienna, VA, 2002, ISBN 1-56726-140-X

 

A Discussion Panel: The Value of Software Quality Certifications

Oct 2007

Why should a quality assurance practitioner bother with getting or renewing a certification? Will it result in a higher salary, more job mobility, or advancement, or is it just a way to feel more professional about your work and career? Come join us for a discussion about certifications and their value in today's world.

Audio only; no slides.

Moderator:

Alejandro Ramirez, Microsoft

Panelists:

Roy Eisenbach, Boeing

Tom Gilchrist, Boeing

Robin Spisak, State of Washington, Dept. of Corrections

No Photo this month

(Not everyone was photogenic)

Test Logging and Automated Failure Analysis, AKA: Why You Can't Afford to Write Weak Automation

Sept 2007

Abstract: 

One of the practical truths of test code is that it is held to a lower standard than product code. This often leaves test automation difficult to maintain and untrustworthy, and its failures impossible to diagnose.

Automated Failure Analysis is the method of determining whether a given failure has previously been observed. It applies a minimum standard of failure analysis to every test failure and prevents problem tests from being mentally swept under the rug as unreliable or untrustworthy.

One of the first steps toward the benefits of Automated Failure Analysis is to create useful meta-data as part of your logging practices. To help you improve your automation logging, Geoff has prepared a set of logging best practices for the audience.
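To make the idea concrete, here is a minimal sketch in Python (the field names and the matching rule are illustrative inventions, not Geoff's actual tooling) of the kind of logging meta-data that lets a tool decide whether a failure has been seen before:

    import hashlib

    def log_failure(test_id, step, error_type, message, log_store):
        """Record a test failure with enough meta-data that a tool
        can match it against previously observed failures."""
        # A stable signature: which test, which step, what kind of error.
        # Volatile details (timestamps, machine names) are kept out of it.
        signature = hashlib.sha1(
            f"{test_id}|{step}|{error_type}".encode()).hexdigest()
        seen_before = signature in log_store
        log_store.setdefault(signature, []).append(
            {"test": test_id, "step": step,
             "type": error_type, "message": message})
        return seen_before  # True means a known failure, not a new bug

    known = {}
    if not log_failure("LoginTest", "submit_credentials",
                       "TimeoutError", "no response in 30s", known):
        print("New failure: route to a human for analysis")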

PowerPoint Slides

Logging Best Practices Paper

Geoff Staneff, PhD, Microsoft

 

Geoff joined Microsoft in 2005 and has worked as a tester on Vista, Server 08 and the next version of Visual Studio Team Suite.  Coming late to the development cycle, Geoff has identified common weaknesses in typical test automation logging and worked on a tool to reduce the tedium of test failure analysis.

Focusing on test code quality, Geoff is now responsible for the adoption of best practices within his organization. In addition, he works on the Dynamic Systems Initiative, validating the Core Model Library used to describe hardware, software, and any manageable entity in the Server Management space.

Dr. Staneff received his PhD in Materials Science from the California Institute of Technology.

ISO 25012 - An International Standard for Data Quality

August 2007

Abstract: 

Managing and enhancing the quality of data is essential in today's interoperable world. Yet the quality of the data flowing among the computer systems of the organizations, agencies, and institutions that depend on it is often unplanned, and therefore unknown.

The purpose of the ISO 25012 standard is to prompt creators of large- and small-scale databases to observe predefined criteria that enable them to evaluate and test data quality, set up integrated and interoperable databases, reduce ambiguity, avoid redundancy, ease data maintenance, and promote reliable, secure databases.

Mike will talk about the evolution of ISO 25012; how data differs from software in its creation, maintenance, and testing; and how data undergoes distinct processes of appraisal, cleansing, matching, transformation, and, finally, archiving for display on users' dashboards.
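As a rough illustration only (the three criteria below are generic examples chosen for this page, not the standard's normative list), evaluating records against predefined data quality criteria might look like this in Python:

    # Illustrative data quality checks: completeness, uniqueness,
    # and consistency, each reported as a score between 0 and 1.
    records = [
        {"id": 1, "email": "a@example.com", "country": "US", "zip": "98058"},
        {"id": 2, "email": "",              "country": "US", "zip": "9805"},
        {"id": 1, "email": "b@example.com", "country": "CA", "zip": "98058"},
    ]

    def completeness(rows, field):
        """Share of rows where the field is populated."""
        return sum(1 for r in rows if r[field]) / len(rows)

    def uniqueness(rows, field):
        """Share of rows whose key is not duplicated (a redundancy check)."""
        values = [r[field] for r in rows]
        return sum(1 for v in values if values.count(v) == 1) / len(rows)

    def consistency(rows):
        """US rows should carry a five-digit ZIP code."""
        us = [r for r in rows if r["country"] == "US"]
        return sum(1 for r in us if len(r["zip"]) == 5) / len(us)

    print(f"email completeness: {completeness(records, 'email'):.2f}")
    print(f"id uniqueness:      {uniqueness(records, 'id'):.2f}")
    print(f"US ZIP consistency: {consistency(records):.2f}")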

ISO 25012 Info

Software Horror Stories Site

Michael Kress, Co-editor

U.S. Technical Advisory Group to ISO/IEC JTC1 SC7

Mike Kress is an Associate Technical Fellow within Boeing Commercial Airplanes Global Partners Procurement QA. He has over 30 years' experience in military and commercial aviation hardware and software, and has written guidebooks for the U.S. Air Force on trainer and simulator software. He holds a Bachelor's degree in Electrical Engineering, is a Fellow of ASQ, holds ASQ CQE and CSQE certifications, and is a Registered Professional Engineer. He has led several Boeing and industry advisory groups that have written or contributed to software standards, most notably RTCA/DO-178B and AS9006. Mike is a former ASQ Software Division Regional Councilor and past chair of the ASQ Software Division. He is a member of the U.S. Technical Advisory Group to ISO/IEC JTC1 SC7 and is co-editor of ISO standards on COTS software and data quality. He is an RAB/QSA registered QMS and Aerospace Industry Experience Auditor.

Software Quality in Life-critical Products

July 2007

The U.S. Food and Drug Administration's (FDA) Quality System Regulation and the European Union's (EU) Medical Device Directive each have specific requirements for software used in the production of medical devices, for software that is a component of a medical device, and for software that is itself a medical device. Developments within the FDA and the EU may soon alter the regulatory landscape for health information systems previously considered outside the scope of medical device regulations. Attend this talk and find out how these changes may affect you.

 

Edward J. Johnson, Siemens Medical Solutions USA

Edward J. Johnson is a Compliance and Quality Analyst with Siemens Medical Solutions USA in Malvern, PA. He previously served as Director of Regulatory Strategies for Liquent, a Thomson Business, and as Regulatory Affairs Counsel and Compliance Officer for Stelex, a life sciences consulting firm. He has also been an attorney in private practice, serving as outside counsel to pharmaceutical, medical device, diagnostics software, firmware, and hardware companies.

 

No photo yet

Testing in Feature Crews

June 2007

With the next release of Visual Studio, Microsoft's Developer Division moved to a feature crew model. A feature crew is a small interdisciplinary team responsible for producing a feature. A feature is an independently testable unit of work that is either directly visible to a customer or is infrastructure that will be consumed by another feature. Features should be small enough that they can be worked on by a feature crew, but large enough that it actually makes sense to test them independently.

The feature crew model helps the Developer Division break big features down into smaller, manageable, independently testable deliveries, and prevents the destabilization of the product that can occur when features are delivered as one big, end-to-end endeavor.

This presentation introduces the feature crew concept and gives a practitioner's perspective on working in feature crews compared with other methods of software development.

Slides in PowerPoint (421kb)

 

Irinel Crivat, Microsoft

Irinel Crivat was born in Bucharest, Romania, where she attended the University of Bucharest and received a B.S. in Mathematics. In another life, Irinel spent 7 years teaching Mathematics and Computer Science. For the last 6 years, Irinel has worked at Microsoft Corporation in Redmond. During this time, she went through 8 product cycles in various roles, had 6 managers, and changed offices five times in 2 different buildings.

Test Driven Development

May 2007

One of the key elements of Agile development is "Test Driven Development". But do you really understand what Test Driven Development is and why it works so well? Come to this presentation to learn what Test Driven Development is, how and why it works, how it changes the QA/tester's job, and how it changes the developer's job. We'll also talk about how to transition from a typical testing approach into a Test Driven Development approach.
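For attendees who want a concrete picture beforehand, here is a minimal sketch of the red-green-refactor loop using Python's built-in unittest module (the example is the editor's illustration, not material from the talk):

    import unittest

    # Step 1 ("red"): write the test before the code it exercises;
    # it fails until ShoppingCart does the right thing.
    class ShoppingCartTest(unittest.TestCase):
        def test_total_sums_item_prices(self):
            cart = ShoppingCart()
            cart.add("apple", 0.50)
            cart.add("bread", 2.25)
            self.assertAlmostEqual(cart.total(), 2.75)

    # Step 2 ("green"): write just enough production code to pass.
    class ShoppingCart:
        def __init__(self):
            self.items = []

        def add(self, name, price):
            self.items.append((name, price))

        def total(self):
            return sum(price for _, price in self.items)

    # Step 3 ("refactor"): clean up the code, re-running the test
    # after every change to be sure nothing broke.
    if __name__ == "__main__":
        unittest.main()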

 

PDF Slide Handouts (62kb)

 

Steve Tockey, Construx Software

Steve Tockey is the Principal Consultant at Construx Software. He has been employed in the software industry since 1977 and has worked as a programmer, analyst, designer, researcher, consultant, and adjunct professor. Steve is the designated corporate representative to the Object Management Group (OMG, the source of the UML). During his career, which has included stints at Lawrence Livermore National Laboratory, The Boeing Company, and Rockwell Collins, Inc., Steve has gained in-depth knowledge of software engineering practices, including software project management, estimation, software quality techniques, object-oriented development, and distributed object computing. He is widely published and has extensive experience with software development and maintenance at all levels of application, as well as knowledge of a wide range of hardware.

Steve holds a Master of Software Engineering (6/93) from Seattle University as well as a Bachelor of Arts in Computer Science (12/81) from the University of California, Berkeley. He is also an IEEE Computer Society Certified Software Development Professional (CSDP).

Steve is the author of Return on Software, a book designed to help software professionals maximize the return on their software investment.

To contact Steve Tockey, send email to steve.tockey@construx.com or call him at (425) 636.0100.

 

An Examination of Use Cases

April 2007

How do we communicate the processes we want to analyze, code, and test? Mark will help us understand Use Case models: what they are, and the components and characteristics of a good Use Case. He will explain why they are important to defining what needs to be done and who is going to be doing it. Mark will also give us some ideas on how to graphically represent and read use cases for development and testing. Through real-life examples, he will show how to build real-life process paths, or scenarios, and when to stop modeling a Use Case.
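As a rough sketch of the idea (the ordering workflow below is an invented example, not Mark's notation), the main and alternate flows of a use case can be walked mechanically to enumerate every testable process path:

    # A use case's flows modeled as a tiny graph: step -> possible next steps.
    flows = {
        "enter order":      ["validate payment"],
        "validate payment": ["ship order", "reject payment"],  # alternate flow
        "reject payment":   ["notify customer"],
        "ship order":       [],   # end of the happy path
        "notify customer":  [],   # end of the alternate path
    }

    def scenarios(step, path=()):
        """Depth-first walk yielding every complete process path."""
        path = path + (step,)
        if not flows[step]:           # reached an end state
            yield path
        for nxt in flows[step]:
            yield from scenarios(nxt, path)

    for s in scenarios("enter order"):
        print(" -> ".join(s))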

PowerPoint handouts (1mb)

Mark Smith, Adaptis, Inc.

During his software engineering career, Mark has worn many hats, including software designer, developer, analyst, information architect, consultant, and trainer, as well as several roles in professional services. In the Seattle area, he has most recently worked at ASIX, Edifecs, Amazon, and the Boeing Company. Currently, he is working at Adaptis (www.adaptisinc.com). His primary area of expertise is in methods, tools, and techniques supporting business, system, and software requirements analysis, with a keen interest in the use of graphical notations and modeling.

Currently, Mark is the Director of the Operations Support Services department at Adaptis. In this role, he directs the business analysis, procedure documentation, instructional design, and training functions supporting operations. Current projects range from business process reengineering and improvement initiatives to system enhancement work.

No Photo Yet

Performance Testing Process

March 2007

Performance testing is a challenging job. It is very different from functional testing and can be highly frustrating: much of it is an uphill battle that requires stellar cooperation and coordination from everyone involved. To get the job done on time, it is essential that the performance testing team and the project team work together to set up a well-thought-out performance testing process.

In this presentation, Emily will discuss the details of the performance testing process: how to hold initial discussions with project teams, how to set up goals and a test plan, and how to work with the project team to execute the testing, help with performance tuning, and get performance testing done on time.
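As a minimal sketch of the "agree on a goal, then measure against it" step (the URL, goal, and load levels below are placeholders, not anything from Emily's process):

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "http://localhost:8080/health"   # placeholder; assumes a server is listening
    GOAL_P95_SECONDS = 0.5                 # response-time goal agreed with the team
    CONCURRENT_USERS = 20
    REQUESTS = 200

    def timed_request(_):
        """Issue one request and return its wall-clock latency."""
        start = time.perf_counter()
        urlopen(URL).read()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        latencies = sorted(pool.map(timed_request, range(REQUESTS)))

    p95 = latencies[int(0.95 * len(latencies)) - 1]
    verdict = "PASS" if p95 <= GOAL_P95_SECONDS else "FAIL"
    print(f"95th percentile: {p95:.3f}s ({verdict} against the goal)")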

PowerPoint Handouts (170k)

 

Emily Ren

Emily has eight years' experience in QA, specializing in performance testing with automation tools. She has worked at many companies on large applications. Emily prides herself on her drive to use cutting-edge technologies, as well as on her insistence on applying best practices to set up strategic, efficient test processes. Currently she handles performance testing projects in T-Mobile's IT department, where she established the standard performance testing process and has successfully worked on over 40 projects in less than three years. She has given numerous presentations and trainings on the performance testing process at T-Mobile.

Prior to T-Mobile, she worked at Ernst & Young as a consultant, and while there worked with many Fortune 500 companies, including Microsoft, E*Trade, HP, Johnson & Johnson, Pfizer, and many others. After Ernst & Young, she worked with major companies in Seattle, including RealNetworks and Washington Mutual. Emily enjoys sharing her successful experiences.

 

Risky Business: The Perils and Pitfalls of Risk-Based Testing

February 2007

Risk-based testing has become an important part of the tester's strategy for balancing the scope of testing against the time available for testing. Although risk-based methods have been helpful in prioritizing testing, it is important to realize that there are ways we can be fooled by risk. In this presentation, Randall Rice will discuss at least twelve ways that risk assessment and risk-based methods can fail. In addition, Randy will draw parallels to other risk-based industries and discuss the important role of contingencies as the safety net when the unexpected occurs.
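The arithmetic behind most risk-based prioritization is simple, which is part of why it can mislead. Here is a sketch, with invented feature names and ratings, of ranking test areas by exposure (likelihood of failure times impact of failure, each rated 1-5):

    # Risk exposure = likelihood x impact; test the highest exposure first.
    # Misjudge either rating and the ranking quietly misleads you.
    features = [
        ("payment processing", 3, 5),   # (name, likelihood, impact)
        ("report formatting",  4, 2),
        ("user login",         2, 5),
        ("help screens",       4, 1),
    ]

    for name, likelihood, impact in sorted(
            features, key=lambda f: f[1] * f[2], reverse=True):
        print(f"{name:20s} exposure = {likelihood * impact}")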

PDF Handouts (572k)

by Randall W. Rice

Randy Rice is a leading author, speaker, and consultant in the field of software testing and software quality. Rice, a Certified Software Quality Analyst and a Certified Software Test Engineer, has worked with organizations worldwide to improve the quality of their information systems and optimize their testing processes. He is a popular speaker at international conferences on software testing and is also publisher of The Software Quality Advisor newsletter. He is co-author, with William E. Perry, of the books Surviving the Top Ten Challenges of Software Testing and Testing Dirty Systems, published by Dorset House Publishing Co. Randy also serves on the board of directors of the American Software Testing Qualifications Board (ASTQB). He is the principal consultant and trainer of Rice Consulting Services, Inc., based in Oklahoma City, OK.

 

Build Your Own Robot Army

January 2007

Software testing is tough: it can be exhausting, and there is never enough time to find all the important bugs. Wouldn't it be nice to have a staff of tireless servants working day and night to make you look good? Well, those days are here. Two decades ago, software test engineers were cheap and machine time was expensive, demanding that test suites run as quickly and efficiently as possible. Today, test engineers are expensive and CPUs are cheap, so it becomes reasonable to move test creation onto the shoulders of a test machine army. But we're not talking about run-of-the-mill automated scripts that only do what you explicitly told them; we're talking about programs that create and execute tests you never thought of and find bugs you never dreamed of. In this presentation, Harry Robinson will show you how to create your robot army using tools lying around on the Web. Most importantly, learn how to take appropriate credit for your army's work!
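In the spirit of the talk, here is a minimal sketch of one such robot: a model-based random walk that generates test sequences no hand-written script is likely to cover (the application model and actions are invented for illustration):

    import random

    # A tiny state model of an application under test:
    # current state -> {action: next state}.
    model = {
        "logged_out": {"login": "logged_in"},
        "logged_in":  {"open_file": "editing", "logout": "logged_out"},
        "editing":    {"save": "logged_in", "close": "logged_in"},
    }

    def random_walk(start, steps, seed=None):
        """Generate one test by randomly traversing the model."""
        rng = random.Random(seed)
        state, test = start, []
        for _ in range(steps):
            action, nxt = rng.choice(sorted(model[state].items()))
            test.append(action)   # here a real harness would drive the app
            state = nxt           # ...and assert it actually reached `state`
        return test

    print(random_walk("logged_out", 10, seed=42))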

PDF Handouts (2.6mb)

Note: an on-demand version (which includes code slides) will be available after March 2007.

Harry Robinson is a Software Engineer in Test for Google.

He coaches teams around the company in test generation techniques. His background includes ten years at AT&T Bell Labs, three years at Hewlett-Packard, and six years at Microsoft before joining Google in 2005. While at Bell Labs, he created a model-based testing system that won the 1995 AT&T Award for Outstanding Achievement in the Area of Quality. At Microsoft, he pioneered the test generation technology behind Test Model Toolkit, which won the Microsoft Best Practice Award in 2001. He holds two patents in software test automation methods, maintains the site www.model-based-testing.org, and speaks and writes frequently on software testing and automation issues.

Email questions about SASQAG or this web site to webmaster at sasqag.org

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058