
Seattle Area Software Quality Assurance Group

Join the SASQAG group on LinkedIn

Past Meetings 2010

Nine Tips to Encourage Collaborative Testing

January, 2010

Collaboration among software testers effectively brings another perspective to early software evaluation and helps create an environment ideal for testing multi-user scenarios, permissions, security, custom configurations, integration, and cross-platform and cross-product workflows. In addition to providing additional opportunities to uncover bugs, it provides a chance to share product knowledge and testing techniques.

Given these advantages, why isn’t more collaborative testing being done? Because most testers don’t fully understand the value of collaboration, and most companies don’t know how to entice them to participate. Only by spelling out the benefits for testers and clarifying the value you place on their time and talent can you draw them in. This presentation will cover nine ways to make it happen, with real-world examples based largely on the presenter’s experiences at Adobe.

Slides for this presentation are available here.

Lanette Creamer
Adobe


Lanette (lanette.creamer@gmail.com) is a quality lead for Adobe Systems, where she coordinates cross-product testing events for the company’s Creative Suites. She has 10 years of software industry experience.

Design for Testability

February, 2010

If you had an opportunity to build an application from the ground up, with testability a key design goal, what would you do?

In this presentation, we will look at just such a situation - a major, two-year rewrite of a suite of core business systems. We will discuss how a system looks when testability is as important as functionality - and what it looks like when quality concerns are part of the initial design. We will also examine the role of test automation and manual testing in a modern project, along with the tools and processes involved. The session will conclude with a demo of the latest visual test automation tool from MIT and a Q&A.

Slides are available here.

Will Iverson


Will Iverson has been writing computer software since he was a wee kid. Since then, he has worked for Apple, Symantec, SolutionsIQ, and Slalom, as well as running his own consulting company. He has written four books on software development and several articles, and has spoken at dozens of conferences. He currently works for All Star Directories as an Enterprise Architect.

Why Quality Happens - and Why Sometimes it Does Not

March, 2010

This presentation is a guided discussion of why, in some places and at some times, quality results are assured (Quality Assurance), with insight into why at other times it seems that quality simply does not, or cannot, happen.

Slides are available here.

Steve Neuendorf

Steve Neuendorf has over thirty years of experience in consulting, management, industrial engineering, and measurement, with twenty-five years directly related to the management, measurement, and improvement of software engineering projects and processes. Steve also has extensive management consulting experience, BA and MBA degrees with postgraduate work in information management, and a JD with a subsequent practice focusing on business. Steve has extensive teaching experience ranging from academics to hands-on workshops.

Recently, Steve completed a two-year research project for NASA on the management of strategy, portfolios, programs, and projects in Corporate America. The study analyzed program and management structure, sizing, and performance, and the influence of the recent and dramatic changes it uncovered.

Steve is the author of two books:

Six Sigma for Project Managers, 2004 Management Concepts, Inc., Vienna, VA, ISBN 1-56726-146-9

Project Measurement, 2002 Management Concepts, Inc., Vienna, VA, ISBN 1-56726-140-X

And co-author of:

The 77 Deadly Sins of Project Management, 2009, Management Concepts, Inc., Vienna, VA, ISBN 978-1-56726-246-9
 

Stop Guessing How Customers Use Your Software

April, 2010

What features of your software do customers use the most? What parts of the software do they find frustrating or completely useless? Wouldn’t you like to target these critical areas in your testing? Most organizations get feedback—much later than anyone would like—from customer complaints, product reviews, and online discussion forums. Microsoft employs proactive approaches to gather detailed customer usage data from both beta tests and released products, achieving greater understanding of the experience of its millions of users. Product teams analyze this data to guide improvement efforts, including test planning, throughout the product cycle. Alan Page shares the inner workings of Microsoft’s methods for gathering customer data, including how to know what features are used, when they are used, where crashes are occurring, and when customers are feeling pain. Learn how your organization can employ similar strategies to make better decisions throughout your development life cycle. Alan shares approaches for gathering customer data that can work for any software team—and improve the customer experience for everyone.

Slides are here.

Alan Page
Microsoft

A tester since 1993, Alan Page joined Microsoft in 1995 and currently is a Principal SDET on the Office Communicator team. At Microsoft, Alan has worked on various versions of Windows, Internet Explorer, and Windows CE, and is the former Director of Test Excellence. He is the lead author of How We Test Software at Microsoft, writes about testing on his blog, and recently contributed a chapter to Beautiful Testing. Alan speaks frequently about software testing and careers for software testers.

Collaborative Problem Resolution through Effective Communication

May 20, 2010

Emotionally, it can be a touch-and-go situation in a test group. The stress of schedules, budget crunches, pressure to "get the numbers up!", and constant talk about "problems" are always part of the landscape. A tester needs to know how to communicate effectively within this environment. A team of managers, developers, and testers needs to know how to collaborate well to resolve issues efficiently. What are the fundamentals of a successful collaborative environment? What are the factors that impact effective communication? This session will answer these questions and show you how to prioritize and deal with disruptive elements. Several case studies will be presented, giving you tangible examples of how to manage communication and collaboration within a project team environment.

Slides are here.

Paul Trompeter


Paul has a broad range of experience in the computer industry at a variety of companies, including Litton, Motorola, Hewlett-Packard (Tandem Computers), Network Appliance, SDT, WellPoint, Technicolor, GDI InfoTech, and Headstrong. Prior to moving into quality and testing, he held positions as a development programmer and manager, a project manager, and a program manager. For the last twenty years, however, Paul has been directing enterprise-wide and global quality and testing efforts spanning many industries, including aerospace, manufacturing, entertainment, financial, and healthcare. Paul holds degrees in math, technology management, and organizational change. He has published journal articles and has participated in pre-publication reviews for several books. Paul has also successfully conducted business in many countries across the globe, including those in Asia, Australia, South America, North America, and Europe.
LifeCycle Systems™ (LC) core model

June 17, 2010

LC is a dynamic meta-model for designing systems - ultimately, any complete, high-quality system. The model emphasizes building quality into both the processes and the products, with a strong emphasis on testing, verification, and validation against stringent standards. Poor quality, missed schedules, overrun budgets, and missing requirements are not an option.

We would call it new, but LC has been around and has proved itself many times over.

Don’t just take our word for it: http://www.youtube.com/watch?v=LvuzPQ44kME

LC’s Dynamic Action Plans (DAPs) ensure that models and reality merge with integrity, ability, and honesty.

 

Slides are here.

David Holliday



David Holliday is Chief Architect with Human Systems Knowledge Networks, Inc., a benchmarking company in search of best practices and those interested in using them. He is a member of Who’s Who Worldwide and a recipient of many awards, including a Presidential Citation.
Three Reasons Your Automated Test Isn't Automatic—and How to Fix It

July 15, 2010

Despite dedicated and repeated efforts by software teams to ensure that automated tests are easy to write, run, and read, system tests are often brittle, tedious, and slow. They continue to be the biggest drain on both human and computer resources and impede the software production lifecycle. Anders Wallgren describes how the world's most innovative development organizations have come to understand how automated system test harnesses evolve and identifies three universal challenges to improving test harnesses:

  1. Provisioning and configuring the target environment
  2. Efficiently automating the invocation of complex test suites
  3. Transforming volumes of test result data into actionable metrics
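
For illustration only (not taken from the presentation), here is a minimal Python sketch of the third challenge: distilling raw test-result data into a few actionable numbers. It assumes a JUnit-style XML report; the file name results.xml and the choice of metrics are hypothetical.

    # Minimal sketch: summarize a JUnit-style XML test report into actionable metrics.
    # Assumptions: the report follows the common JUnit XML layout; results.xml is a
    # hypothetical file name used only for this example.
    import xml.etree.ElementTree as ET

    def summarize(report_path):
        root = ET.parse(report_path).getroot()
        # The root may be a single <testsuite> or a <testsuites> wrapper.
        suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
        total = failed = 0
        timings = []
        for suite in suites:
            for case in suite.findall("testcase"):
                total += 1
                if case.find("failure") is not None or case.find("error") is not None:
                    failed += 1
                timings.append((float(case.get("time", 0)), case.get("name", "?")))
        timings.sort(reverse=True)
        print(f"{total} tests, {failed} failed ({failed / max(total, 1):.1%} failure rate)")
        print("Slowest tests:")
        for seconds, name in timings[:5]:
            print(f"  {seconds:7.2f}s  {name}")

    if __name__ == "__main__":
        summarize("results.xml")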

 

Slides are here.

Anders Wallgren



Anders Wallgren is Chief Technical Officer of Electric Cloud. Anders brings with him over 15 years of in-depth experience designing and building commercial software. Prior to joining Electric Cloud, Anders held executive positions at Aceva, Archistra, and Impresse. Anders also held management positions at Macromedia (MACR), Common Ground Software, and Verity (VRTY), where he played critical technical leadership roles in delivering award-winning technologies such as Macromedia’s Director 7 and various Shockwave products.
Agile Tweaks - Finding next steps in your Agile Process Evolution

August 19, 2010

This presentation describes ways to look at your agile process to improve quality and productivity. Key areas discussed include tooling, metrics, collaboration, getting to DONE – DONE – DONE, and generally tightening your process.

 

Slides are here.

Jeff Smith



Jeff Smith is an ALM (Application Lifecycle Management) Consultant at Northwest Cadence. He is an advocate for software process improvement and has been a primary process advisor and developer for Dell Computer, BearingPoint, Boeing, LexisNexis, IBM, assorted startups, and other organizations. He has a Bachelor of Science in Electrical Engineering from Texas A&M and did post-graduate studies in Computer Science and Business Administration at the University of Texas - Arlington and the University of Texas - Dallas. He has almost three decades of high-tech and software development experience across a range of organizations, industries, and roles. Jeff is a Six Sigma Greenbelt and a Certified ScrumMaster, and is also ITIL Foundations Certified, ASTQB Certified (CTFL), and Lean+ Certified. He is the current chair of SeaSPIN.org and has served on the boards of the Austin SPIN, the Association for IT Professionals - Austin Chapter, and Agile Austin.
Testing the Mobile Application's Performance - a Case Study on Mobile Devices

September 16, 2010

Mobile software usage is growing quickly, and an expanding number of consumers expect a consistent experience when moving from the PC to mobile devices and back again. One of the key factors enabling wide deployment and adoption of mobile applications is careful usage of system resources. High-quality mobile applications must consume fewer system resources such as CPU, memory, and, most importantly, battery life. It is crucial that mobile application design treats frugal and optimal resource usage as an essential concern (directly porting desktop applications to mobile platforms is most often a recipe for failure). Knowing and understanding the factors that influence performance on mobile devices is extremely beneficial for planning a successful testing approach. Equally challenging is the task of measuring application performance and reporting the performance data in a useful and actionable manner.
 
We are going to present a case study on performance testing of the Microsoft Office Communicator Mobile application for Windows Mobile 6.x phones. We will discuss our approach to performance testing and the challenges involved in measuring various metrics. We will also provide details on the performance metrics, the tools used to gather and present this data, and how we’ve used this data to make important decisions throughout the product cycle. Some architectural improvements have contributed to better application performance and resource usage. For example, battery life improved by more than 300% in the 2007 R2 version of Communicator Mobile.
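
For illustration only (not taken from the case study), here is a minimal Python sketch of how a battery-life improvement figure like the one above could be derived from periodic battery-level samples collected during a long-running test. All numbers below are made up for the example.

    # Minimal sketch: estimate battery drain from (elapsed_hours, battery_percent)
    # samples and compare two builds. The sample data is hypothetical.

    def drain_rate(samples):
        """Average battery drain in percent per hour over the sampled run."""
        (t0, level0), (t1, level1) = samples[0], samples[-1]
        return (level0 - level1) / (t1 - t0)

    baseline = [(0.0, 100.0), (1.0, 88.0), (2.0, 76.0)]   # hypothetical old build
    improved = [(0.0, 100.0), (1.0, 97.0), (2.0, 94.0)]   # hypothetical new build

    old_life = 100.0 / drain_rate(baseline)   # hours from full charge to empty
    new_life = 100.0 / drain_rate(improved)

    print(f"Estimated battery life: {old_life:.1f}h -> {new_life:.1f}h "
          f"({new_life / old_life - 1:.0%} improvement)")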

Slides are here.

 

Rama Krishna



Rama Krishna has been working at Microsoft as a Software Development Engineer in Test for five years. He has been extensively involved in testing the Communicator Mobile application on various mobile devices, including Windows Mobile 6.x series devices. Prior to Microsoft, he worked as a Software Developer at Cisco Systems for four years. He earned a Master’s degree in Computer Science in 2001.
Why Agile Works: an Exploration of Agile, Discipline, Uncertainty, and Risk

October 21, 2010

Teams that succeed with Agile methods reliably deliver releasable software at frequent intervals and at a sustainable pace. At the same time, they can readily adapt to the changing needs and requirements of the business. Unfortunately, not all teams are successful in their attempt to transition to Agile and, instead, end up with a “frAgile” process. The difference between an Agile and a frAgile process is usually in the degree to which the organization embraces the disciplined engineering practices that support agility. Teams that succeed are often the ones adopting specific practices: test-driven development, automated regression testing, Continuous Integration, and more. Why do these practices make such a big difference? Elisabeth Hendrickson details essential Agile engineering practices and explains how they mitigate common project risks related to uncertainty, ambiguity, assumptions, dependencies, and capacity.

Slides are here.

Elisabeth Hendrickson



Elisabeth Hendrickson has over 20 years of software industry experience and has been working with Agile teams since 2004. She is a Certified Scrum Master, a respected thought leader in Agile Testing, a former member of the board of the Agile Alliance, and co-organizer of the Agile Alliance Functional Testing Tools program.
A Gathering of Software Prophets

November 18, 2010

Ever wonder what a century of combined experience making software looks like? Are there nagging questions plaguing your quality and testing efforts that you want answered? A gathering of software quality professionals from diverse industries will be available to answer your questions at the final SASQAG event of the season. The SASQAG Board members and friends will participate in a panel discussion on software quality, covering both Quality Assurance and Quality Control.

There are no slides available for this event.

Panel:
Keith Stobie, Tom Gilchrist, Alan Page

Email questions about SASQAG or this web site to: webmaster at sasqag.org

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058