Seattle Area Software Quality Assurance Group

Join the SASQAG group on LinkedIn

Past Meetings 2012

Playback Testing Using Log Files

February, 2012

Most testers are familiar with record/playback testing. A variation of this approach is log playback testing: instead of explicitly recording user actions with a recorder, this technique relies on the information in the log file produced by the system under test to play back the scenario. This is extremely useful for reproducing issues reported by customers, which are typically hard to reproduce otherwise due to the lack of detailed repro steps. The approach also lends itself well to creating data-driven tests and can be used to augment existing test automation. Testing and investigating failures in a non-deterministic software system can be very costly and painful; in most cases, the only source of information about which code path was exercised is the log files produced by the application. This paper describes how the information in log files can be used to play back a sequence of actions to test specific code paths and to reproduce bugs. We will also explore an extension of this approach in which log files serve as a source for data-driven testing, simply by varying the sequencing of the actions that are played back. The paper illustrates the technique using a real-world example. Finally, the paper discusses the benefits and how the approach can be applied to different test domains.
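As a rough sketch of the idea (hypothetical, not from the talk: the log format, action names, and handlers below are invented for illustration), a replayer parses the logged actions back out of the log and re-dispatches them against the system under test:

```python
import re

# Hypothetical log format: "2012-02-16 19:02:11 ACTION Click target=SaveButton"
LOG_LINE = re.compile(r"^\S+ \S+ ACTION (?P<name>\w+) target=(?P<target>\S+)$")

def parse_actions(log_lines):
    """Extract the recorded (action, target) pairs; non-action lines are skipped."""
    actions = []
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            actions.append((m.group("name"), m.group("target")))
    return actions

def replay(actions, dispatch):
    """Replay each logged action by invoking the matching handler."""
    for name, target in actions:
        dispatch[name](target)

log = [
    "2012-02-16 19:02:11 ACTION Click target=SaveButton",
    "2012-02-16 19:02:12 INFO flushing cache",   # not an action; ignored
    "2012-02-16 19:02:13 ACTION Type target=NameField",
]
replayed = []
replay(parse_actions(log), {
    "Click": lambda t: replayed.append("clicked " + t),
    "Type": lambda t: replayed.append("typed into " + t),
})
print(replayed)
```

Varying the order or parameters of the parsed action list before replaying it is what turns the same mechanism into a data-driven test generator.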

Slides posted soon

Vijaya Upadya

Vijaya Upadya is a senior software development engineer in test at Microsoft, currently working at the Microsoft main campus in Redmond, Washington. Over the past decade, he has been involved in software development and testing for products ranging from compilers, to libraries such as LINQ to SQL, Silverlight, and RIA Services, to Windows Live Mesh. Vijay primarily focuses on test strategy, test frameworks, and tools for the team. Vijay has a Master's degree in Systems Analysis and Computer Science from Mangalore University, India.
Signals in Testing

March, 2012

This talk outlines the signals that indicate radical changes in software testing and quality for the next five years. Industry signals point to significant changes in how we will test software. The leading indicators are: number of web and mobile applications, continuous build and ship cycles, app stores, importance of location and social context, complexity of operating system and device matrices, and the immediacy and the empowerment of end users to share feedback.

These signals point to a strong increase in demand for testing overall, but the testing required is disruptive to existing approaches, best practices, organization and tooling. The methods that are showing promise in this new world are: developers owning quality and component test development, infrastructure to support deployment and feedback signals, analysis of public and private quality signals, generalized application scanning techniques, and aggressive leveraging of early adopters and crowd-based testers for bandwidth and diversity.

Slides SignalsInTesting.pdf

Jason Arbon

Jason Arbon has worked in test and development roles at a wide range of companies. He is currently an Engineering Director. Before that, he held test leadership and innovation roles at Google, working on the centralized testing team and on projects such as Google+, the Chrome browser, Chrome OS, Google Desktop, and Google Talk. He also worked on teams at Microsoft, including Bing (search relevance and QnA), WinFS, BizTalk Server, MSN, Windows CE, and Exchange Server. Along the way, he co-founded a small social search startup before it was cool, and worked at several small and mid-size companies. He just wrapped up co-authoring the book 'How Google Tests Software'. Jason received degrees in Electrical Engineering and Computer Engineering from the University of Utah.
Rescuing a Troubled Project

April, 2012

In this talk I will speak about a recent project that I was asked to rescue. I will detail many of the challenges we faced and more importantly what caused them in the first place. I will talk about how we solved these issues and some common sense strategies that people can use to avoid and/or resolve these kinds of issues on their projects. This information will be applicable to anyone involved in a software development/deployment project and will give testers a broader view into the project lifecycle.

Slides are here

Daniel Williams

Daniel Williams is not only an outstanding tester with years of testing experience - he's our host at Compucom (Excell) in Bellevue.
A Consultant’s Perspective - Common Performance Testing Mistakes

May, 2012

Reliance on technology in all aspects of our lives is driving a greater need to measure the speed and effectiveness of networks and software programs, as well as a wide range of devices. Demand for performance testing has never been greater and looks only to grow for years to come. This presentation will address common mistakes made in the application of performance tests.

Subjects that will be addressed include:

  • Mimicking real user patterns
  • Trying to mimic more traffic with fewer virtual users
  • Environment (AUT as well as Automation Servers)
  • Scripting
  • Test team
  • Doing too much with one test
  • Not understanding the application
  • Incomplete protocol traffic
  • Pass/Fail Criteria
  • Monitoring
  • Analysis
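One way to see why "trying to mimic more traffic with fewer virtual users" is a mistake is Little's law: the number of concurrent virtual users needed is throughput multiplied by (response time + think time). Squeezing the same request rate out of fewer virtual users by cutting realistic think time changes the concurrency the server actually experiences. The worked example below is illustrative and not from the talk:

```python
def virtual_users_needed(target_rps, response_s, think_s):
    """Little's law: N = X * (R + Z), where X is throughput (req/s),
    R is response time, and Z is user think time (both in seconds)."""
    return target_rps * (response_s + think_s)

# The same 50 req/s load, modeled two ways:
realistic = virtual_users_needed(50, 0.2, 9.8)   # real users pausing ~10 s between requests
compressed = virtual_users_needed(50, 0.2, 0.0)  # no think time: far fewer, hyperactive users
print(realistic, compressed)
```

Both configurations generate 50 requests per second, but the first keeps roughly 500 sessions open against the server while the second keeps only about 10, so connection pools, session state, and caches are exercised very differently.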

Slides posted soon

Raul DeLeon,
Vice President, Quality Assurance/Testing Practice, Q Analysts LLC

Raul has 27 years of software engineering experience and has provided innovative, high-value solutions. For the past 12 years he has focused on Quality Assurance and Testing and has helped overcome a wide range of complex QA challenges globally for customers including Wal-Mart, American Express, AT&T, China Construction Bank, Hewlett-Packard, EDS, Perot Systems, Harley-Davidson, Seagate, Dell, and the IMF. Raul is an expert at the highest level on Hewlett-Packard's suite of testing solutions and has partnered with HP R&D and PSO for many years to solve daunting technical problems and to provide valuable insight into how the products are actually used by their customers. Certified in ITIL, Raul oversees all aspects of Q Analysts' Quality Assurance/Testing Practice.

About Q Analysts
Q Analysts delivers Quality Assurance and Testing solutions, Business Intelligence and Data Warehousing solutions and IT Professional Services to Fortune® 500 companies. A certified Minority Business Enterprise, Q Analysts is headquartered in Santa Clara, CA and for the second consecutive year is one of America’s 10 fastest growing firms in the industry, according to Staffing Industry Analysts.
Software Test Design through Behavioral Modeling

June, 2012

Author Shel Prince presents a groundbreaking methodology for software-testing design that assures fewer bugs and thoroughly tested software applications in less time, and with less hassle, than traditional methods. The breakthrough involves Behavioral Modeling, a technique that produces the minimum-size test suite for the maximum testing coverage. Addressing the three biggest problems facing anyone responsible for testing software—time constraints, too many bugs, and inadequate requirements—author Prince offers systematic and concrete solutions to these problems, providing a scalable, actionable approach valid in an enterprise or boutique development environment. Sure to become a classic in the world of software testing best practices, Software Test Design through Behavioral Modeling removes the guesswork from test suite design and empowers end users, developers, testers, and project managers with the tools they need to move their projects into go-live status with confidence.

The book is the basis for the SASQAG presentation June 21. Should folks happen to have a copy, the author will be happy to sign them.

Slides Behavioral Modeling SASQAG.pptx

Shel Prince
QA Director at T-Mobile

Shel Prince has been in the software quality field for more than forty years. Prior to joining T-Mobile, Shel consulted with clients both large and small, as well as governments, in many parts of the world.
Slow down to go fast - Tortoise Overtaking the Hare

July, 2012

Are you the tortoise or the hare? Many software teams today practice "rapid development" techniques, and while it is true that some practices and methods produce results more quickly than others, slowing down and (potentially) building less leads to higher quality and efficiency gains in the long run. At Microsoft, "Slow down to go fast" was the mantra of the Xbox voice search team, which took on the persona of the "tortoise" to successfully deliver on its objectives with high quality. By adhering to the principles of Agile while being agile in their methodologies and practices, the team found a flexible but structured framework to work within. Paramount to this collaboration was the close partnership between developers and testers. Out of this collaboration grew a set of philosophies and best practices that led to a successful product launch. Come hear a development lead and senior tester talk about what went well and what they learned, with a focus on slowing down to go fast, improving early-cycle - and ultimately shipping - quality.

In this session, participants will learn the what and why of our best practices, including:

  • The "Buddy System" employed by development and test
  • Code reviews, buddy building and buddy testing
  • Interactive design and documentation using OneNote
  • Driving crisp and test-driven "done" definitions
  • The right level of unit testing and test automation with feedback from code coverage

Slides Slow Down to Go Fast - Agile 2012.pptx
James Waletzky and Randy Santossio

James Waletzky is a principal consultant at Crosslake Technologies, which helps companies transform their product strategy, organization, process, and tools to drive improved product value and productivity. Previously, James was a lead developer at Microsoft responsible for delivering the Bing voice search, voice user interface, biometric identity, and Kinect retail kiosk features in the December 2011 Xbox update. He has over 8 years of experience with agile practices and methodologies, including Scrum, Test-Driven Development, and emergent design, to name a few. During his decade at Microsoft, he spent 3 years on the Microsoft Engineering Excellence team teaching developers how to be more effective. Before Microsoft, he was a development manager at NCompass Labs, where he developed Content Management Server (formerly NCompass Resolution).

Randy Santossio is a senior SDET at Microsoft who worked on delivering the Bing search experience in the December 2011 Xbox update. Over the last 8 years at Microsoft, he worked on different technologies for Xbox, Windows, and in-game features for Gears of War 2 and Lips, while utilizing different variations of Scrum and other agile methodologies. During the last two decades, he has spent time in developer- and tester-focused positions, as a director for large technology teams and as a senior individual contributor in technical roles, across various companies including Microsoft, Symantec, and SmartServ Online.
Transitioning to Scrum Framework with Distributed/Offshore Teams

August, 2012

Mark hopes to stimulate some discussion by describing Cobalt's process change. What do QA/QC staff do for teams using agile processes? He'll discuss current successes and challenges with Scrum Framework. He'll also talk about working with offshore staff in a partnership based on trust.

Slides ScrumTransition.pptx or PDF ScrumTransition.pdf
Mark Bullock
ADP Digital Marketing Solutions aka Cobalt

Mark Bullock serves as Senior QA Manager at ADP Digital Marketing Solutions aka Cobalt. He's worked in QA and QC on web applications for more than 15 years. Mark worked as a developer for more than 10 years prior, including a near real-time system to analyze data for NASA wind tunnel experiments. He's always been interested in avoiding defects and finding defects as early as possible.
Life After Testing

September, 2012

After giving his ‘test is dead’ presentation at STAR West in 2011, James had no choice but to walk the walk. With 300+ devs on his team, how are they doing without testers? What has happened to quality? Culture? Review scores? Come hear his thoughts on what is working and what isn’t and where he thinks testing is headed in the future.

Slides Life After Testing.pptx
James Whittaker

James Whittaker is a technology executive focused on making the web a better place for users and developers. He is a former Googler, former professor and former startup founder. Follow him on Twitter @docjamesw
Succeeding with Behavior Driven Development (BDD) Testing and Automation

October, 2012

Behavior driven development expresses examples of product use in human language and uses tools to execute those examples as tests. Even before code is written, BDD tests can remove ambiguity from requirements. Once automated, BDD tests can provide a reassuring mapping of tests to customer acceptance requirements. I will talk about my successes (and failures) applying BDD tests at Google and Microsoft, survey tools for BDD, and provide guidance for success.
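A hypothetical miniature of the idea (real BDD tools such as Cucumber do this with far more polish; the step patterns and scenario below are invented for illustration): each Given/When/Then sentence is matched against a registered step definition and executed as test code.

```python
import re

# Toy BDD runner: maps Given/When/Then sentences to step functions.
STEPS = []

def step(pattern):
    """Register a step function under a regex pattern."""
    def register(fn):
        STEPS.append((re.compile(pattern), fn))
        return fn
    return register

@step(r"a cart with (\d+) items?")
def given_cart(ctx, n):
    ctx["cart"] = int(n)

@step(r"I add (\d+) items?")
def when_add(ctx, n):
    ctx["cart"] += int(n)

@step(r"the cart holds (\d+) items?")
def then_holds(ctx, n):
    assert ctx["cart"] == int(n), ctx["cart"]

def run_scenario(lines):
    """Strip the BDD keyword, find the matching step, and run it."""
    ctx = {}
    for line in lines:
        text = re.sub(r"^(Given|When|Then|And)\s+", "", line)
        for pattern, fn in STEPS:
            m = pattern.fullmatch(text)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise ValueError("no step matches: " + line)
    return ctx

ctx = run_scenario([
    "Given a cart with 2 items",
    "When I add 3 items",
    "Then the cart holds 5 items",
])
```

The scenario text doubles as documentation that non-programmers can read and review, which is where the requirements-disambiguation benefit comes from: an unmatched or failing sentence surfaces as a test failure before or after the code exists.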

Slides Succeeding with Behavior Driven Development (BDD) Testing and Automation.pptx

Audio recording 2012-10-18-SASQAG.mp3
Alan Myrvold

Alan Myrvold is a test engineer with Google in Kirkland, working on advertising products. Prior to Google, Alan worked at Microsoft within the Office division, and in Canada at Entrust and Cognos.
Common Engineering Principles and Practices

November, 2012

In this talk, Qingsong Yao will discuss a set of commonly used engineering principles and practices, namely continuous integration, continuous delivery, and various testing strategies that you can use in your daily work. These principles have helped many organizations release their products quickly and with high quality, and are fully integrated into the Agile process. He will demo how these engineering principles and practices were used in his DAC Import/Export improvements. In addition, he will show how organizations and companies such as the Bing team, the Protection Services team, and Google use these principles.

Slides serviceEngineering2.pptx

Qingsong Yao

Qingsong Yao is a Senior Tester on the SQL Server Service Excellence team. He graduated from York University with a PhD in Computer Science. Since 2005, Qingsong has been a member of the SQL Server XML testing team and has been involved in the development of a wide range of features, including Katmai NewDateTime, Sparse Column, Apollo, DAC in-place upgrade, and database logical import/export. His areas of expertise are scenario-based testing, testing strategy, testing in production, and growing team members. He likes to read testing books and blogs, and he writes at http://my/sites/qyao/Blog/default.aspx to share thoughts about testing and service development.

Email questions about SASQAG or this web site to: webmaster at

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058