
Seattle Area Software Quality Assurance Group

Join the SASQAG group on LinkedIn

Past Meetings 2009

The Future of Testing

Jan 2009

If there were a World Cup of Product Failure, surely software would be a strong contender for the title. What other product has such a poor track record of quality, yet has so successfully embedded itself in nearly every aspect of our lives? And the future looks even more digital, with computing tables, walls, countertops, and RFIDs embedded in nearly everything we buy. How will we test the software of tomorrow when we struggle with the software of today? Join James as he presents a vision of software testing in the future.

James will detail several compelling predictions about the future of testing and how that future will impact the daily work of the software tester. Some of the predictions are already in progress now; others are 10+ years out. However, the technology to realize James's vision requires no substantial leap; the foundation to build it is here today. What is necessary is that software professionals look beyond the day-to-day testing grind with a keen eye for innovation and opportunity.

Presentation Slides

James A. Whittaker

Principal Architect,
Microsoft Corporation

James A. Whittaker, currently a Principal Architect at Microsoft, has spent his career in software testing. He was an early thought leader in model-based testing, and his PhD dissertation from the University of Tennessee became a standard reference on the subject. While a professor at Florida Tech, he founded the world's largest academic software testing research center and helped make testing a degree track for undergraduates. Before he left Florida Tech, his research group had grown to over 60 students and faculty and had secured over $12 million in research awards and contracts. During his tenure at FIT he wrote How to Break Software and the series follow-ups How to Break Software Security (with Hugh Thompson) and How to Break Web Software (with Mike Andrews). His research team also developed the highly acclaimed runtime fault injection tool Holodeck and marketed it through their startup, Security Innovation, Inc. James is a frequent keynote speaker and has won numerous best-presentation awards. His blog can be found at http://blogs.msdn.com/james_whittaker.
Ideas for Rapid Test Management

Feb 2009

Jon Bach assumes that whether you're a tester or a test manager, you have no time to do the things you want to do. Knowing that even the things you *need* to do will compete for your attention, he has some ideas to keep it all straight. It's not about time management; it's about where your energy goes. In this talk, Jon will share the ideas that seem to be working for him as a test manager of 15 people on 3 projects at LexisNexis. His ideas are meant to solve common problems in test execution, reporting, measurement, and personnel, and all of them are low (or no) cost and relatively easy to implement.

Presentation Slides

Jon Bach

LexisNexis

Jon Bach has been a software tester for more than 13 years, and is currently a Senior QA Manager at LexisNexis. He speaks frequently about exploratory and rapid testing, and is the co-inventor of session-based test management.
Beneath, Between, and Behind the Lines of Test Excellence at Microsoft

March 2009

There are over 35,000 software engineers at Microsoft, including nearly 10,000 testers. What in the world could all of these people possibly be doing, how much of it is common to all of Microsoft, and what do successful testers at Microsoft do?

In this conversation with Alan Page (Microsoft Director of Test Excellence and author of How We Test Software at Microsoft), we'll discuss the answers to these questions and dive into the back story of how a former music teacher and bicycle messenger ended up documenting the testing practices of one of the most successful companies in the world. This will be a highly interactive session, so bring your questions and be ready for a fun conversation about testing and the paths that brought us all to this great career.

Presentation Slides

Alan Page
Director, Test Excellence


Microsoft

Alan Page began his career as a tester in 1993. He joined Microsoft in 1995, and is currently the Director of Test Excellence, where he oversees the technical training program for testers and various other activities focused on improving testers, testing, and test tools. In his career at Microsoft, Alan has worked on various versions of Windows, Internet Explorer, and Windows CE. Alan writes about testing on his blog, and is the lead author of How We Test Software at Microsoft (Microsoft Press, 2008, http://www.hwtsam.com).
Debugging for Sport: How to Effectively Find and Prevent Bugs in Your Organization

April 2009

Software bugs are not problems that impact only IT departments, developers, and direct users. A software problem can have a broader impact on a company and its brand, and companies ignore this at their own peril. The presentation reviews the top ten bugs of all time and their impact; for example, the Intel Pentium FDIV, COMAIR, and NASA bugs are discussed. Next, the top ten bug prevention techniques are discussed, such as encouraging education and maintaining updated architecture and design documents. Finally, the top debugging tools are listed, along with their place in the debugging process.

Presentation Slides

Donis Marshall

Donis Marshall is a premier, recognized trainer of computer technology for developers and scientists. Donis is an endorsed trainer for Microsoft Global Learning Services; in this role, he has trained Microsoft developers and engineers for nearly fifteen years. He has extensive experience with both native (unmanaged) and managed code.

Donis is also the author of the best-selling Visual C# book from Microsoft Press, Programming Microsoft Visual C# 2005. He is also the author of .NET Security Programming and Directory Services Programming for Windows, both published by Wiley Worldwide Books, as well as the author of Active/OLE Programming with MFC, published by R&D Books. His latest book, Solid Code from Microsoft Press, just arrived in bookstores.

In recent years, Donis has focused on native and managed debugging. He has taught debugging-related classes and concepts around the world to many of the leading software companies. He is presently the President of DebugLive, which is introducing a suite of debugging tools, most notably a web-based debugger for debugging local and remote applications. Donis has a diversified background and training experience in a variety of technologies, including Visual C#, the Common Language Runtime, Visual Basic .NET, .NET Security Programming, .NET Patterns and Architecture, C++ Programming, MFC Programming, Win32 SDK Programming, and COM Programming.

From 1999 to 2003, Donis was the President of Gearhead Press and Consulting Editor for Wiley Worldwide Books. Gearhead Press published more than two dozen books for IT engineers and developers. Prior to Gearhead, Donis worked as the Director of Advanced Technical Learning Solutions at Productivity Point International (from 1998 to 1999), where he directed advanced technical training throughout the PPI training network.

From 1989 through 1997, he was President and Senior Instructor of the Training Alliance, a Microsoft Authorized Training Education Center with offices in Charlotte, NC; Raleigh, NC; Columbia, SC; and Charleston, SC. He taught Windows programming at companies such as Oracle, Tandem, IBM, AutoDesk, Xerox, Nortel, and many others.
Add Antirandom Testing to your Skill Set

May 2009

Antirandom testing is a variation of pure random testing in which each new test case added to a collection of test cases is maximally different from the cases already in the collection. Topics covered include the origins of antirandom testing, the differences among random testing, pure antirandom testing, and partial antirandom testing, and studies of the effectiveness of antirandom testing. You will leave this presentation with a solid understanding of what antirandom testing is and when its use is appropriate.
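
The core selection idea is easy to sketch. The short Python example below is a minimal illustration (not material from the talk): it assumes test inputs encoded as fixed-width bit vectors, uses Hamming distance as the difference measure, and greedily adds whichever candidate is farthest, in total, from the cases already chosen.

from itertools import product

def hamming(a, b):
    # number of positions in which two equal-length bit vectors differ
    return sum(x != y for x, y in zip(a, b))

def antirandom_suite(num_bits, num_tests):
    # greedily build a suite over the (small) space of num_bits-wide binary inputs
    candidates = list(product([0, 1], repeat=num_bits))
    suite = [candidates[0]]  # arbitrary starting case, here all zeros
    while len(suite) < num_tests:
        # next case: the candidate maximally different from everything chosen so far
        best = max((c for c in candidates if c not in suite),
                   key=lambda c: sum(hamming(c, s) for s in suite))
        suite.append(best)
    return suite

print(antirandom_suite(num_bits=4, num_tests=5))

Run on four-bit inputs, the sketch pairs all-zeros with all-ones first, which matches the intuition behind maximal difference; the exhaustive candidate list keeps it practical only for tiny input spaces.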

Presentation Slides Coming Soon

Dr. James McCaffrey
Senior Director


Volt

Dr. James McCaffrey works for Volt Information Sciences and oversees several technical programs for software engineers working at Microsoft's Redmond, WA campus. James has worked on several key Microsoft products including Internet Explorer and MSN Search. He holds a doctorate from the University of Southern California, and degrees in mathematics and psychology from California State University at Fullerton and the University of California at Irvine. James is a Contributing Editor for Microsoft's MSDN Magazine, where he writes the "Test Run" articles on software test automation. McCaffrey is the author of .NET Test Automation Recipes (Apress Publishing, 2006) and Software Testing: Fundamental Principles and Essential Knowledge (BookSurge Publishing, 2009).
Rediscover yourself. Careers and certifications in software quality that work for you.

June 2009

There is nothing like waking up every morning and feeling energized about work and life. Self-awareness is, more than ever, one of the most powerful means to get ahead of the crowd professionally and personally. But that's just the start; to be successful you need to make others aware of your professionalism and understand what it means for organizations, products, and processes, and how it impacts the customer.

Presentation Slides
Certificate Comparison (excel)
Certificate Comparison (pdf)

Alejandro Ramirez


TEKSystems / Boeing CAS

Alejandro Ramirez has been involved in education and corporate training for 18 years in areas like Software Testing, Programming, Computer Applications, Volleyball, Music, and English as a Foreign Language. He has 11+ years of professional experience in the IT field and a Master of Science degree in Instructional Technology & Telecommunications from Western Illinois University, with a specialty in distance education and eLearning. He is actively involved in training, workshops, and study groups for certifications delivered online, remotely, or in person. At the moment he is SASQAG's certifications coordinator and a QA Engineer for Boeing CAS through TEKsystems.
Data-Driven, Experiential SPI

July 2009

So much SPI is evangelism, attempting to impose the one true way from outside. That tends to fail, and at best it creates compliance.

Process improvement that works treats SPI as what it is: a kind of experiential learning, centered on the team doing software development and supported by data about local practices and results. The team learns how to do things differently, perhaps drawing candidates from an external catalog of techniques.

This talk draws on about a dozen personal experiences with process and practice changes, plus industry examples, with reference to antecedents like TQM and Deming and to current frameworks like "Lean" and Critical Chain.

  • Supporting the fact of change
  • "Your process is what you do"
  • Process development is knowledge formation.
  • Relevance: mission and data and mission and data and ...
The talk finishes with a strategy that works for explicit, ongoing SPI.

This material is part of the book Change in Technology Development Organizations: Learning On Purpose, which Jim Bullock has been writing for some time.

Slides for this presentation are available here.
James Bullock


James Bullock has been successfully building systems for more than 20 years. In that time he has built high-volume embedded control software, automated plant-floor manufacturing, architected enterprise data warehouse systems, created tools used to manage multi-million-SLOC tactical and commercial systems, run technology departments in Internet-based businesses, and shipped multiple releases of innovative software products for the enterprise.

Through this varied experience, James has remained more interested in how systems are built than in the systems themselves. He has written on subjects such as the development system as a system, the value of testing as a function in a business, software tools and methods in e-commerce, database performance tuning, and how software projects differ from other projects. He is the lead editor of Roundtable on Project Management and coeditor of Roundtable on Technical Leadership, both published by Dorset House.

A Seattle resident, since 2002 James has focused on "conscious software development," guiding clients in purposefully changing how they develop the software they depend on. He is currently developing presentations of general systems thinking in software engineering practice and teaching. James still occasionally builds software or does automated testing because, he says, "I like the toys."
Too much automation or not enough? When to automate testing.

August 2009

Fundamentally, test automation is about return on investment (ROI): do we get better quality for less money by automating or by not automating? The obvious and famous consultant answer is "it depends." This paper explores the factors that influence when to choose automation and when to shun it; a rough break-even sketch follows the list below. Three major factors you must consider:

  1. Rate of change of what you are testing. The less stable, the more automation maintenance costs.
  2. Frequency of test execution. How important is each test result and how expensive to get it?
  3. Usefulness of automation. Do automated tests have continuing value to either find bugs or to prove important aspects about your software, like scenarios?
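
As a rough, purely illustrative way to weigh the first two factors in hours, the Python sketch below estimates how many executions it takes before an automated test pays for itself. It is not a model from the talk, and every number and parameter name is a hypothetical assumption.

def automation_break_even(manual_cost_per_run,
                          automation_dev_cost,
                          automated_cost_per_run,
                          maintenance_cost_per_change,
                          changes_per_run):
    # per-run cost of the automated test, amortizing maintenance caused by product churn
    effective_auto = automated_cost_per_run + maintenance_cost_per_change * changes_per_run
    savings_per_run = manual_cost_per_run - effective_auto
    if savings_per_run <= 0:
        return None  # churn eats the savings: keep the test manual
    return automation_dev_cost / savings_per_run

# hypothetical numbers: 2 hours manual, 16 hours to automate, 0.1 hours per automated run,
# 1 hour of maintenance per product change, one change per four runs on average
runs = automation_break_even(2.0, 16.0, 0.1, 1.0, 0.25)
if runs is None:
    print("At these numbers, automation never pays for itself")
else:
    print(f"Automation breaks even after about {runs:.0f} runs")

With those made-up numbers the break-even point is roughly ten runs; raise the churn or lower the execution frequency and the answer flips, which is the "it depends" in concrete form.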


Slides for this presentation are available here.
Keith Stobie


Keith Stobie is a Test Architect for the Protocol Engineering team at Microsoft, working on model-based testing (MBT), including test frameworks, harnessing, and model patterns. He also plans, designs, and reviews software architecture and tests. Previously, Keith worked in Microsoft's Windows Live Search (live.com) and XML Web Services groups. With twenty-five years of distributed systems testing experience, Keith's interests are in testing methodology, tools technology, and quality process. Keith has a BS in computer science from Cornell University.
ASQ Certified Software Quality Engineer, ASTQB Foundation Level. Member: ACM, IEEE, ASQ.
Score One for Quality: Using Games to Improve Product Quality

September 2009

Doing research into the generation gap between current managers (from the Baby Boomer era) and the incoming group of Gen X, Gen Y, and Millennials, we find that there is a lot of work demonstrating the effect of video games on younger employees. Taking that slant, we set out to improve the legacy concept of a bug bash or simple leaderboard-driven, single-task-oriented game into something richer that would help drive greater engagement among all employees.

What we found, however, was a very powerful mechanism for communicating organizational priorities effectively and quickly. Not only can we help people feel engaged, we can drive behaviors using games that help improve both product quality and morale. This leads to a virtuous cycle where standard productivity metrics also improve as engagement improves. Our latest game is driving a new level of quality into our localized products by leveraging Microsoft's diverse worldwide employee base. We predict that "Games at Work" or "Productivity Games" carry a huge potential for influencing not just the software engineering workplace, but all types of companies and work.


Slides for this presentation are available here.

Joshua Williams
Microsoft

Joshua Williams, Test Architect, Windows Defect Prevention at Microsoft Corporation, has loved the personal computer for over 20 years and has worked to improve the user experience on PCs for the past 15 years. His work testing Windows has spanned releases from Windows 95 through Windows 7 and nearly everything in between. His work has ranged from globalization efforts to improve the quality of non-English versions of Windows, to improving driver quality for Universal Serial Bus support, to designing and implementing large-scale test automation systems. Three years ago, Joshua changed focus to work on strategies to improve software quality throughout the entire software lifecycle and on projects focused on making work more enjoyable. His work with 42Projects (www.42projects.org and on Facebook) has certainly brought "buzz back to the hallways" he inhabits. Most recently, working with productivity games and exploring how games and fun can help get work done motivates him to learn a little more each day.

October 2009

What do you do if you want to improve a process and you have 100 factors that are candidate predictors? How do you decide where to direct your causal analysis effort? Similarly, what if you want to create an estimating model or a simulation, and you have so many factors you do not know where to start? Data mining techniques have been used to filter many variables down to a vital few in order to focus causal analysis and build model-based estimates. Specific software engineering examples are provided in four categories: classification, regression, clustering, and association.

When creating a predictive model to understand a process, the primary challenge is how to start. Regardless of the variable being estimated (e.g., effort, cost, duration, quality, staff, productivity, risk, size) there are many factors that influence the actual value and many more that could be influential. The existence of one or more large datasets of historical data could be viewed as both a blessing and a curse: the existence and accessibility of the data is necessary for prediction and learning, but traditional analysis techniques do not provide us with optimum methods for identifying key independent (predictor) variables from a large pool of variables. Unfortunately, the Lean Six Sigma body of knowledge does not include data mining as a subject area. Data mining techniques can be used to help thin out the forest, so that we can examine the important trees.
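
As one hedged illustration of the idea (not an example from the talk), the short Python sketch below uses a random-forest importance ranking, via scikit-learn, to thin 100 synthetic candidate factors down to a vital few; the library choice, the data, and every name in it are assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_projects, n_factors = 200, 100  # e.g. 200 historical projects, 100 candidate predictors
X = rng.normal(size=(n_projects, n_factors))
# hypothetical "true" process: effort driven mainly by factors 3, 17, and 42, plus noise
effort = 5 * X[:, 3] - 3 * X[:, 17] + 2 * X[:, 42] + rng.normal(scale=0.5, size=n_projects)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, effort)
ranked = np.argsort(model.feature_importances_)[::-1]
print("Candidate predictors to examine first:", ranked[:5])

Any comparable filter from the four categories named above (classification, regression, clustering, association) serves the same purpose: shrinking the pool of variables before the expensive causal analysis begins.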

Slides for this presentation are available here.

Paul Below
EDS

Paul Below has over 25 years of experience in measurement technology, statistical analysis, forecasting, Lean Six Sigma, and data mining. He has provided innovative engineering solutions, as well as teaching and mentoring internationally, in support of multiple industries. He serves as an analyst for EDS, an HP Company, where he provides executive leaders and clients with statistical analysis of operational performance, helping strengthen competitive position through process improvement and predictability.

Mr. Below is a Certified Software Quality Analyst, a past Certified Function Point Specialist, and a Six Sigma Black Belt. He has been a course developer and instructor for Estimating, Lean Six Sigma, Metrics Analysis, and Function Point Analysis, as well as for statistics in the Masters of Software Engineering program at Seattle University. He is a member of the IEEE Computer Society, the American Statistical Association, the American Society for Quality, and the Seattle Area Software Quality Assurance Group, and he has served on the Management Reporting Committee of the International Function Point Users Group. He has one US patent and two pending.
Large-scale Exploratory Testing: Let's Take a Tour

November 2009


Manual testing is the best way to find the bugs most likely to bite users badly after a product ships. However, manual testing remains a very ad hoc, aimless process. At a number of companies across the globe, groups of test innovators gathered in think-tank settings to create a better way to do manual testing: a way that is more prescriptive, repeatable, and capable of finding the highest-quality bugs. The result is a new methodology for exploratory testing based on the concept of tours through the application under test. In short, tours represent a more purposeful way to plan and execute exploratory tests. James Whittaker describes the tourist metaphor for this novel approach and demonstrates tours taken by test teams from various companies, including Microsoft and Google. He presents results from numerous projects where the tours were used in critical-path production environments. Join James and learn about the collection of test tours, test cases, and bugs from these case studies, and take back recommendations for using tours on your own products.
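
To make the tour idea concrete, here is a small, hypothetical Python sketch of how a team might organize exploratory sessions around named tours; the data structure and its fields are assumptions rather than anything presented in the talk, and only the tour names echo the tourist metaphor.

from dataclasses import dataclass, field

@dataclass
class TourSession:
    tour: str                 # which tour guides this session
    charter: str              # the mission for the session
    timebox_minutes: int = 90
    bugs_found: list = field(default_factory=list)

sessions = [
    TourSession("Money tour", "Exercise the end-to-end scenarios shown in the sales demo"),
    TourSession("Landmark tour", "Hit the key features in a deliberately unusual order"),
    TourSession("Garbage collector's tour", "Sweep every screen briefly, checking the obvious"),
]
for s in sessions:
    print(f"[{s.tour}] {s.charter} ({s.timebox_minutes} min)")

Even a lightweight record like this makes exploratory work more prescriptive and repeatable, which is the point of the tour metaphor.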


Slides for this presentation are available here.

James Whittaker
Google

James Whittaker has spent his career in software testing. He was an early thought leader in model-based testing, where his Ph.D. dissertation became a standard reference on the subject. While a professor at the Florida Institute of Technology, James founded the world's largest academic software testing research center and helped make testing a degree track for undergraduates. While at FIT, he wrote How to Break Software and the series follow-ups How to Break Software Security (with Hugh Thompson) and How to Break Web Software (with Mike Andrews). As a software architect for Visual Studio Team System at Microsoft, James transformed many of his testing ideas into tools and techniques for developers and testers and wrote the book Exploratory Software Testing. He is currently the Test Engineering Director for the Kirkland and Seattle offices of Google, where he's busy forging a future in which software just works.

Email questions about SASQAG or this web site to webmaster at sasqag.org

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058