
Seattle Area Software Quality Assurance Group


Join the SASQAG group on LinkedIn

Past Meetings 2013

Applying Agile/Scrum Methodology in Performance Testing to Deliver Quality

January 2013

In many companies, performance testing methodology is not as clear or as mature as functional testing methodology. Performance testing is challenging work: it differs substantially from functional testing and can be highly frustrating. Much of performance testing is an uphill battle that requires stellar cooperation and coordination from everyone involved. It is essential for the performance testing team and the project team to work together to set up a well-thought-out performance testing process with a proper methodology.

Agile/Scrum methodology is a natural fit for performance test process because:

  • Performance testing is a whole team effort.
  • Agile/Scrum methodology can help integrate testing, development, monitoring and tuning activities.
  • Agile/Scrum methodology can promote experimental performance test design and proactive tuning.
  • Agile/Scrum methodology can help minimize documentation and consequently reduce total testing time.

This presentation discusses common mistakes in performance test processes and shows how to apply Agile/Scrum methodology so the team can work together effectively and optimize the outcome of performance testing and tuning activities.
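As a concrete illustration (this sketch is not from the talk), here is the kind of lightweight performance check a Scrum team could automate and run every sprint alongside its functional tests; the endpoint URL and latency budget below are hypothetical placeholders.

```python
# Minimal sketch, not from the talk: a per-sprint performance check.
# TARGET_URL and LATENCY_BUDGET_P95 are hypothetical placeholders.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "http://localhost:8080/health"  # hypothetical endpoint
LATENCY_BUDGET_P95 = 0.500                   # seconds; hypothetical budget

def timed_request(_):
    """Issue one request and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    with urlopen(TARGET_URL) as response:
        response.read()
    return time.perf_counter() - start

def run_check(users=10, requests_per_user=20):
    """Simulate concurrent users; pass if p95 latency is within budget."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(timed_request, range(users * requests_per_user)))
    p95 = statistics.quantiles(latencies, n=20)[-1]  # 95th percentile
    print(f"p95 latency: {p95:.3f}s (budget: {LATENCY_BUDGET_P95:.3f}s)")
    return p95 <= LATENCY_BUDGET_P95

if __name__ == "__main__":
    raise SystemExit(0 if run_check() else 1)
```

A check this small can live in a sprint's definition of done, which is one way the testing, monitoring, and tuning activities the abstract mentions can be integrated.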

Slides

Emily Ren
Cognizant

Emily Ren has 13 years' experience in QA, specializing in performance testing processes and automation tools. She has worked with many companies on large applications. Emily prides herself on her drive to use cutting-edge technologies, as well as on her insistence on applying best practices to set up strategic and efficient test processes.

Currently, she works as an Associate Director in Cognizant's Performance, Architecture, Consulting and Engineering (PACE) NA practice. Prior to Cognizant, she consulted on performance testing assessments and roadmap development for large organizations. She has worked with many Fortune 500 companies, including Ernst & Young, Cap Gemini, Sogeti, T-Mobile, Microsoft, Boeing, E*Trade, HP, Johnson & Johnson, Pfizer, Expedia, Dell, LexisNexis, Washington Mutual, and Symetra. She has given presentations and training on performance testing processes, including "Implement Efficient Performance Testing Process" at SASQAG (Seattle Area Software Quality Assurance Group) and at an IEEE conference. She is also a certified PMP.

Patterns and Anti-Patterns in Acceptance Test Driven Development

February 2013

We'll explore what went well and what went poorly in some implementations of ATDD, distilled into patterns and anti-patterns of ATDD. We'll cover both technology and process, examining a few different tools (BDD, Fit, etc.) and usages of those tools, as well as how ATDD supports or conflicts with Lean/Scrum processes and principles.

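For readers unfamiliar with the style, here is a minimal sketch of what an acceptance-test-driven workflow can look like, written as a plain Python test rather than in one of the tools the talk examines; the ShoppingCart API and the acceptance criterion are hypothetical.

```python
# Minimal sketch, not from the talk: a customer-worded acceptance
# criterion turned into an executable test, written before the code.
# The ShoppingCart API below is hypothetical.

class ShoppingCart:
    """Toy implementation, written after the test to make it pass."""
    def __init__(self):
        self._items = []

    def add(self, name, price_cents, quantity=1):
        self._items.append((name, price_cents, quantity))

    def total_cents(self):
        return sum(price * qty for _, price, qty in self._items)

# Acceptance criterion (customer wording): "When I add two books at
# $10.00 each and one pen at $1.50, my cart total is $21.50."
def test_cart_total_matches_customer_example():
    cart = ShoppingCart()
    cart.add("book", 1000, quantity=2)
    cart.add("pen", 150)
    assert cart.total_cents() == 2150
```

Tools such as Fit and the BDD frameworks discussed in the talk add a layer that lets the criterion be expressed in tabular or natural-language form; the underlying test-first flow is the same.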

Slides or PDF

Kevin Klinemeier
Davisbase Consulting

Kevin Klinemeier is a Developer, Architect, and Agile Coach working for Davisbase Consulting. He comes to agile development through a path of skepticism, hard knocks, and key projects. Prior to his current consulting work, Kevin was the Software Architect for the Seattle-based national ISP Speakeasy, and a team lead at Expeditors International.

In his spare time, Kevin likes to sail on Puget Sound, snowboard in the mountains, spend time with his daughter, do minor repairs on low-earth-orbit satellites, and tell occasionally amusing jokes. One (possibly two) of these things is not true.

Email: kevin.klinemeier@davisbase.com
Twitter: @agilekevin
Blog: http://zipwow.blogspot.com

DevOps - Containing Complexity and Delivering Continuous Deployment

March 2013

The stories of quick, ongoing deployments at Flickr, Facebook, and Google are well known. Many product teams today are able to mimic these capabilities. In many cases the financial risk is not necessarily too large, and solid engineering practices go a long way toward managing other risks, allowing updates to flow. We have some solid experiences to point at and guidance in some solid texts. But there are companies still struggling to get there - usually bigger, older companies - and often for good reasons.

While some would simply argue that excuses and fear keep teams from making progress toward more continuous delivery, that is usually an oversimplification. Stepping into the situation and seeing the needed evolution can help us understand the challenges of size, interdependence, and risk... when too many sources of complexity impede near-term continuous delivery.

This presentation will walk the path of some real transitions: teams that made it work, and teams that still struggle against industry and internal realities. It will examine the many factors that constrain fast delivery and offer thoughts on how to move forward with your DevOps initiative. The audience is encouraged to bring their own questions and insights, and also to contemplate how QA fits in. Seeing how organizations and products differ - and starting to understand the challenges and leverage points - is perhaps the best path toward coherent direction and real improvement for challenged teams.

Slides

Jeff Smith
Chair, Seattle & Eastside Software Process Improvement Network

Jeff Smith has been in software development for over 20 years and specializes in driving internal development productivity breakthroughs. He is the current chair of the Seattle and Eastside Area Software Process Improvement Network (SPIN), and has also served on the boards of the Seattle ISPI, the Austin SPIN, Agile Austin, and the Austin chapter of AITP (Association of IT Professionals). Jeff's experience includes process and development work and consulting with organizations large and small, including Vertafore, NetObjectives, Dell, IBM, BearingPoint, Boeing, Trilogy, and LexisNexis, as well as some smaller technology startups. He holds a bachelor's degree in electrical engineering from Texas A&M University and a ScrumMaster certification, with additional certifications in Lean, Six Sigma, ITIL, and software test.

Test Innovation for Everyone

April 2013

Software testers' aptitude for systems thinking, and for identifying problems and patterns, makes them well suited for innovation, yet few testers take the time to apply their skills and experience to this end. Successful innovation is not purely a matter of skill, intelligence, or luck. Innovation begins with careful identification and analysis of a problem, obstacle, or bottleneck, followed by a solution that not only solves the problem, but frequently solves it in a way that has widespread benefit - or in a way that changes the basic nature of the problem entirely.

Alan Page breaks down the cogs and wheels of innovation and shows examples of how some testers are applying game-changing creativity to discover new ways to improve tests, testers, and testing in their organizations. Problems, solutions, tips, tricks, and more are all on the radar for this whirlwind tour of pragmatic test innovation. Best of all, you'll walk away knowing that anyone, especially you, can be a test innovator.

Slides at Alan's web site: http://angryweasel.com/Presentations/Test Innovation for Everyone.pptx or Slides


Alan Page
Principal SDET at Microsoft

Alan Page is currently a Principal SDET (yet another fancy name for tester) on the Xbox console team at Microsoft. He has previously worked on a variety of Microsoft products, including Windows, Windows CE, Internet Explorer, and Office Lync. He also spent some time as Microsoft's Director of Test Excellence, where he developed and ran technical training programs for testers across the company.

Alan is edging up on his 20th anniversary as a software tester. He was the lead author of the book How We Test Software at Microsoft, contributed a chapter on large-scale test automation to Beautiful Testing (Adam Goucher/Tim Riley), and contributed to Experiences of Test Automation: Case Studies of Software Test Automation (Dorothy Graham/Mark Fewster). You can follow him on his blog (http://angryweasel.com/blog) or on Twitter (@alanpage).

A Closer Look at Test Automation Frameworks

May 2013

Test automation is a useful tool in software development. However, a highly effective test automation framework is hard to achieve. How do you know whether you have one? Most of us test engineers have worked on automation frameworks written by someone else, or have been involved in designing automation frameworks ourselves. Do you have fuzzy or itchy feelings about your current automation framework but a hard time explaining why? Is it coding style, or a difference in process or approach? Does adding a new test case to your automation take far too long? Does your automation feel "heavy"?

In this presentation, Jae-Jin Lee will discuss why these things happen, explain them in detail with examples, and hopefully scratch some of those itches along the way.

Topics (both fuzzy and itchy) will include goal setting, (bad) refactoring, design patterns, test oracles, ideal APIs and interfaces, test base and bootstrap code, and automation reporting.
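To make two of those topics concrete - bootstrap code and test oracles - here is a minimal sketch; it is not from the presentation, and every name in it is hypothetical.

```python
# Minimal sketch, not from the presentation: keep bootstrap code out of
# individual tests, and make the oracle explicit. All names are hypothetical.

class Session:
    """Stand-in for whatever the framework must bootstrap per test."""
    def __init__(self, user):
        self.user = user

    def search(self, query):
        # A real framework would drive the product under test here.
        return [f"{query}-result-{i}" for i in range(3)]

def logged_in_session(user="test-user"):
    """One shared bootstrap helper instead of copies in every test case."""
    return Session(user)

def results_mention_query(query, results):
    """Explicit oracle: every result should mention the query term."""
    return bool(results) and all(query in r for r in results)

def test_search_returns_relevant_results():
    session = logged_in_session()          # bootstrap stays one line
    results = session.search("widgets")
    assert results_mention_query("widgets", results)
```

When the bootstrap is a one-line helper and the oracle is a named function, adding a new test case stays cheap - one symptom of a framework that does not feel "heavy".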

Video of presentation.

Jae-Jin Lee
Sr. Software Engineer in Test at Expedia

Jae-Jin Lee currently works at Expedia as a Sr. Software Engineer in Test, where he organizes the Expedia Test Talk series. He is a context-driven tester. Before Expedia, he worked at Livemocha.com and Attachmate as an SDET.

Manual Automated Scripted Exploratory (MASE) Testing

June 2013

Manual versus automated is a well-known continuum. Less explicitly known is the scripted versus exploratory dimension and its interaction with manual versus automated. This talk will briefly discuss each combination and focus on contrasting when each might be most appropriate and why - that is, the context for when to use each. How these topics map onto the Agile Testing quadrants, sketched below, will also be discussed.

                      Supporting the Team    Critique Product
  Business Facing     Script                 Manual explore
  Technology Facing   Auto script            Auto explore


In most contexts, you will typically need all four, applying Lessons Learned in Software Testing, Lesson 283: apply diverse half-measures.
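To illustrate the two automated cells (this sketch is not from the talk; the function under test is hypothetical): an automated scripted check pins a fixed input to a fixed expected output, while an automated exploratory check generates inputs and verifies general properties.

```python
# Minimal sketch, not from the talk: "Auto script" vs. "Auto explore"
# against a hypothetical function under test.
import random

def normalize_whitespace(text):
    """Hypothetical function under test."""
    return " ".join(text.split())

# Auto script: one fixed input, one fixed expected output.
def test_scripted_check():
    assert normalize_whitespace("  a \t b\n") == "a b"

# Auto explore: generated inputs checked against general properties
# (idempotence, no stray whitespace) rather than exact outputs.
def test_exploratory_properties():
    rng = random.Random(42)  # seeded so any failure is reproducible
    alphabet = "ab \t\n"
    for _ in range(1000):
        text = "".join(rng.choice(alphabet) for _ in range(rng.randrange(20)))
        out = normalize_whitespace(text)
        assert out == normalize_whitespace(out)  # idempotent
        assert out == out.strip()                # no leading/trailing space
```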

Slides of presentation
Video of presentation.

Keith Stobie
Senior Quality Engineering Architect at TiVo

Keith Stobie specializes in web services, distributed systems, and general testing, especially test design. Previously he was Test Architect for Bing Infrastructure, where he planned, designed, and reviewed software architecture and tests, and worked in the Protocol Engineering Team on the protocol quality assurance process, including model-based testing (MBT) to develop test frameworks, harnesses, and model patterns. With three decades of distributed systems testing experience, Keith's interests are in testing methodology, tools technology, and quality process.

Check out his blog (http://testmuse.wordpress.com) to learn more about his work.
Keith is a volunteer with SASQAG.org and PNSQC.org and a member of AST, ASQ, ACM, and IEEE. He has a BS in computer science from Cornell University, holds the ISTQB Foundation Level and ASQ CSQE certifications, and is a BBST Foundations graduate.
Keith keynoted at CAST 2007 and MBT-UC 2012 and has spoken at many other international conferences.

Systems-of-Systems (SoSs) for Fun & Profit

July 2013

The Market & Career Opportunities for Software Quality
Globally, the world continues to become more technical, more integrated, and more complicated. “Systems”, with their human interfaces, now abound and are much more interconnected, resulting in society-changing capabilities (e.g., cell phones and Facebook enabling the “Arab Spring”). Dramatic “failures” have also occurred, e.g., the recent global economic crisis and global healthcare issues. These “systems” of products, services, technology, and humans are often sub-optimally developed, resulting in poor business returns, missed market opportunities, lowered customer utility, safety concerns, and even bankruptcies. Is there a way to both increase business opportunities and address these issues?

Now emerging are what are called “Systems-of-Systems” (SoSs). This is a new name for age-old societal functions (e.g., transportation) that provides insights for optimizing the interdependencies of large numbers of products, services, technologies, information, and humans that together deliver higher societal performance, human safety, and reduced costs. SoS knowledge gives businesses, governments, professionals, and even lay people insight not available to others, and hence a market, technical, business, and/or personal advantage.

What are these SoS markets that drive business and technical requirements? What is the role of software? And what are the best practices needed to address product and services development and operations? SoSs require special skills and knowledge to better develop and operate not just SoSs, but any major product and service. This talk will review the “what”, “why” and “how” of SoSs including an overview of SE101 to SE501.

Video of presentation
Paul E. Gartz
IEEE Distinguished Lecturer

Paul E. Gartz's career has centered on the application of systems theory to a number of domains, including commercial, civil, defense, and societal programs as well as large-scale, global Systems-of-Systems (SoSs). He developed Boeing's first systems engineering best practices, including research labs, and led efforts to deploy these to the aerospace and communication sectors on multi-billion-dollar programs, more than half of which were world firsts. These included the world's first antiballistic missile system (SAFEGUARD), the first digital airplane (767), and the first high-bandwidth global communication service SoS (Connexion-by-Boeing©).

Paul has a strong interest in living systems, how they interface with human-made systems, and especially the Global Healthcare and Earth Decisions SoSs. He was chief architect of Boeing's entry into the Global Earth Observation System of Systems (GEOSS) at its inception. He was president of three transnational professional societies, where he expanded their fields and global markets. Paul is a member of the Boeing Technical Fellowship, a past president of the IEEE AESS, an IEEE Distinguished Lecturer, and a recipient of the Harry Rowe Mimno Award.

Lessons Learned from Test Automation


August 2013

In 15 years in test at Microsoft, I've experienced seven different test automation frameworks as an architect, owner, and implementer. These experiences range from various attempts at UI test automation to a non-UI framework implemented in an open source scripting language. I've seen test automation be effective, and also less than effective. In this presentation I'll talk about those experiences and the lessons learned. I'll also talk about the role context played in the successes and failures of those frameworks and approaches.

Video of Presentation

Ron Pihlgren
Microsoft
Ron Pihlgren has been a developer and a developer support engineer, and has spent the last 16 years in various test roles at Microsoft. He currently tests business intelligence tools and servers on the SQL Server team.
Knowledge Transfer: Preserving Your Secret Sauce

September 2013

During this talk, Steve will address:

  • The aging workforce crisis, and how to retain and cross-train more than 90% of your company's unique knowledge and "secret sauce."
  • Onboarding new employees, and how to reduce ramp-up time to productivity by 50%.
  • Change management techniques for successfully embedding knowledge transfer into your corporate culture.
  • Real-world case studies of successful knowledge transfer in Fortune 500 companies.

Video Recording
Steve Trautman
The Steve Trautman Co.

Steve Trautman (http://stevetrautman.com) is corporate America's leading knowledge transfer expert. For more than two decades, he has provided executives at blue-chip companies and in the public sector with simple, relevant, and quick solutions for knowledge transfer. His pioneering work in the field of knowledge transfer and related risk management tools is now the nationally recognized gold standard. Developed by Steve in the early 1990s when he worked at Microsoft, his knowledge transfer solution is now used by companies ranging from Boeing to Nike, Kraft to Cadbury. Steve has written two books, Teach What You Know: A Practical Leader's Guide to Knowledge Transfer Using Peer Mentoring and The Executive Guide to High-Impact Talent Management (with co-author David DeLong). His terminology and innovative concepts in the field of knowledge transfer have been adopted at the CEO level.

Data Analysis and Telemetry

October 2013

Data, data, data… science, science, science. But how do we actually put those together? The term "data science" has almost become overused. We are expected to use data to make decisions, yet how do we go about doing that? Does every team need its own data scientist? Which analyses should we evaluate?

This talk will address the basics of data science and describe when an advanced analysis is required. It will cover the basic concepts that every tester needs to understand, define the terminology we use to talk to data scientists, and then lay out a roadmap of the topics every team needs.
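As a taste of the basics (this example is not from the talk, and the event format is a hypothetical one), even plain descriptive statistics over telemetry events can answer useful questions:

```python
# Minimal sketch, not from the talk: descriptive statistics over
# telemetry events. The (feature, latency) event format is hypothetical.
import statistics

# Hypothetical telemetry: (feature name, operation latency in milliseconds)
events = [
    ("search", 120), ("search", 95), ("search", 340),
    ("login", 60), ("login", 75), ("search", 110), ("login", 900),
]

def summarize(events):
    """Group latencies by feature; report count, median, and max."""
    by_feature = {}
    for feature, latency_ms in events:
        by_feature.setdefault(feature, []).append(latency_ms)
    for feature, latencies in sorted(by_feature.items()):
        print(f"{feature}: n={len(latencies)}, "
              f"median={statistics.median(latencies)}ms, "
              f"max={max(latencies)}ms")

summarize(events)
```

Knowing when a summary like this suffices, and when a more advanced analysis is required, is exactly the judgment the talk addresses.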

Video Recording

Bob Musson
Principal Data Scientist, Microsoft

Bob is a Principal Data Scientist on the Lync/Skype team at Microsoft. He has been at Microsoft for 10 years in various roles, including SDE manager, SDE lead, and SDET IC. He currently works as a data scientist analyzing big data from product use. He has been doing data analytics in the software engineering process for almost 20 years, and has been providing analytics for Microsoft teams for almost 10 years. His title has finally caught up to the work he does.
Domesticating the Web and App Store

November 2013

We hunt information on the Web. We gather functionality from the App Store. We’ve been doing this for a couple of decades now. Anyone else tired of it or is it just me? This talk clarifies this hunter-gatherer status of online citizenry and discusses what it will take to domesticate information and functionality. Like our ancestors who domesticated our food supply, domestication of our knowledge supply is going to change everything.

Video Recording

James Whittaker
Distinguished Technical Evangelist, Microsoft

James Whittaker is a technology executive with a career that spans academia, start-ups, and top tech companies. His story starts in 1986 with the distinction of being the first computer science graduate ever hired by the Federal Bureau of Investigation, where he worked to automate agents' caseloads. He then entered graduate school at the University of Tennessee, where he received his PhD in computer science in 1991.

Following academia, James worked as a freelance developer specializing in test automation. He worked in 13 different countries over a five-year period, most notably for IBM, Ericsson, SAP, Cisco, and Microsoft. During this time he performed seminal research in software quality and developer productivity and published dozens of papers, patents, and conference presentations. In 1996 he joined the faculty at the Florida Institute of Technology, where he continued his prolific publication record and won over $12M in sponsored research. James' work in Y2K testing and software security earned any number of best paper and presentation awards, and in 2002 his security work was spun off by the university into a startup that was later acquired by Raytheon.

James' first stint at Microsoft was in Trustworthy Computing and then Visual Studio. In 2009 he joined Google as an engineering director and led teams working on Chrome, Chrome OS, Maps, and Google+. He was also the keynote speaker for Google Developer Days. In 2012 James rejoined Microsoft to build the Bing Information Platform. James is known for being a creative and passionate leader and a sought-after speaker and author. Of his five books, two have been Jolt Award finalists. Follow him on Twitter @docjamesw.

Email questions about SASQAG or this web site to: webmaster at sasqag.org

Mailing Address:
Seattle Area Software Quality Assurance Group (SASQAG)
14201 SE Petrovitsky Rd
Suite A3-223
Renton, WA 98058