Past Meetings 2010
Nine Tips to Encourage Collaborative Testing
Effective collaboration among software testers brings another
perspective to early software evaluation and helps create an environment
ideal for testing multi-user scenarios, permissions, security, custom
configurations, integration, and cross-platform and cross-product
workflows. In addition to creating more opportunities to uncover
bugs, collaboration provides a chance to share product knowledge and testing
techniques.
Lanette is quality lead for Adobe Systems, where she coordinates cross-product testing events for the company’s Creative Suites. She has 10 years of software industry experience.
Design for Testability
If you had an opportunity to build an application from the ground up,
with testability a key design goal, what would you do?
Will Iverson has been writing computer software since he was a wee kid.
Since then, he has worked for Apple, Symantec, SolutionsIQ, and Slalom,
as well as running his own consulting company. He has written four books
on software development and several articles, and has spoken at
dozens of conferences. He currently works for All Star Directories.
Why Quality Happens - and Why Sometimes
it Does Not
This presentation is a guided discussion of why, in some places and at
some times, the quality of results is assured (Quality Assurance), and
why, at other times, it seems that quality simply does not (or cannot) happen.
Steve Neuendorf has over thirty years of experience in consulting,
management, industrial engineering, and measurement, with twenty-five
years directly related to the management, measurement, and improvement of
software engineering projects and processes. Steve also has extensive
management consulting experience; BA and MBA degrees, with postgraduate
work in information management; and a JD and a subsequent legal practice
focusing on business. Steve has extensive teaching experience ranging
from academics to hands-on workshops.
Stop Guessing How Customers Use Your Software
What features of your software do customers use the most? What parts of the software do they find frustrating or completely useless? Wouldn’t you like to target these critical areas in your testing? Most organizations get feedback, much later than anyone would like, from customer complaints, product reviews, and online discussion forums. Microsoft employs proactive approaches to gather detailed customer usage data from both beta tests and released products, achieving greater understanding of the experience of its millions of users. Product teams analyze this data to guide improvement efforts, including test planning, throughout the product cycle. Alan Page shares the inner workings of Microsoft’s methods for gathering customer data, including how to know what features are used, when they are used, where crashes are occurring, and when customers are feeling pain. Learn how your organization can employ similar strategies to make better decisions throughout your development life cycle. Alan shares approaches for gathering customer data that can work for any software team, and improve the customer experience for everyone.
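The kind of usage instrumentation described above can be illustrated with a minimal, hypothetical sketch: an in-process event log that records feature use and crashes, then summarizes which features are used most and where failures occur. The class and event names here are purely illustrative, not Microsoft's actual telemetry API.

```python
import json
import time
from collections import Counter

class UsageTelemetry:
    """Minimal in-process event log: record feature use and crashes,
    then summarize which features are used most and where failures occur."""

    def __init__(self):
        self.events = []

    def record(self, kind, name):
        # Each event keeps a timestamp so later analysis can slice by time.
        self.events.append({"ts": time.time(), "kind": kind, "name": name})

    def summary(self):
        # Count events by (kind, name), e.g. "feature:spell_check" -> 2.
        counts = Counter((e["kind"], e["name"]) for e in self.events)
        return {f"{kind}:{name}": n for (kind, name), n in counts.items()}

t = UsageTelemetry()
t.record("feature", "spell_check")
t.record("feature", "spell_check")
t.record("crash", "print_dialog")
print(json.dumps(t.summary()))
```

A real system would ship these events off-device in batches and aggregate them server-side; the point of the sketch is only that usage and crash data reduce to countable events that can steer test planning.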
Slides are here.
A tester since 1993,
Alan Page joined Microsoft in 1995 and currently is
a Principal SDET on the Office Communicator team. At Microsoft, Alan
has worked on various versions of Windows, Internet Explorer, and
Windows CE, and is the former Director of Test Excellence. He is the
lead author of
How We Test Software at
Microsoft, writes about testing on his
blog, and recently contributed a chapter to Beautiful
Testing. Alan speaks frequently about software testing and
careers for software testers.
Collaborative Problem Resolution
May 20, 2010
Emotionally, it can be touch and go in a test group. The stress of schedules, budget crunches, pressure to "get the numbers up!", and constant talk about "problems" is always part of the landscape. A tester needs to know how to communicate effectively within this environment, and a team of managers, developers, and testers needs to know how to collaborate well to resolve issues efficiently. What are the fundamentals of a successful collaborative environment? What are the factors that impact effective communication? This session will answer these questions and show you how to prioritize and deal with disruptive elements. Several case studies will be presented, giving you tangible examples of how to manage communication and collaboration within a project team environment.
Slides are here.
Paul has a broad range of experience in the computer industry at a variety of companies including Litton, Motorola, Hewlett-Packard (Tandem Computers), Network Appliance, SDT, WellPoint, Technicolor, GDI InfoTech, and Headstrong. Prior to moving into quality and testing, he held positions as a development programmer and manager, a project manager, and a program manager. For the last twenty years, however, Paul has been directing enterprise-wide and global quality and testing efforts spanning many industries including aerospace, manufacturing, entertainment, financial, and healthcare. Paul holds degrees in math, technology management, and organizational change. He has published journal articles and has participated in pre-publication reviews for several books. Paul has also successfully conducted business in many countries across the globe, including those in Asia, Australia, South America, North America, and Europe.
LifeCycle Systems™ (LC) core model
June 17, 2010
LC is a dynamic meta-model for designing systems, ultimately any complete, high-quality system. The model emphasizes building quality into both the processes and the products. It ensures a strong emphasis on testing, verification, and validation against stringent standards. Poor quality, missed schedules, overrun budgets, and missing requirements are not an option.
We would call it new, but LC has been around and has proved itself many times over.
Don’t just take our word for it: http://www.youtube.com/watch?v=LvuzPQ44kME
LC’s Dynamic Action Plans (DAPs) ensure that models and reality stay in step, grounded in integrity, ability, and honesty.
Slides are here.
David Holliday is Chief Architect with Human Systems Knowledge Networks, Inc.; a benchmarking company in search of best practices and those interested in using them. He is a member of Who’s Who Worldwide, and a recipient of many awards, including a Presidential Citation.
Three Reasons Your Automated Test Isn't Automatic—and How to Fix It
July 15, 2010
Despite dedicated and repeated efforts by software teams to ensure that automated tests are easy to write, run, and read, system tests are often brittle, tedious, and slow. They continue to be the biggest drain on both human and computer resources and impede the software production lifecycle. Anders Wallgren describes how the world's most innovative development organizations have come to understand how automated system test harnesses evolve, and identifies three universal challenges to improving test harnesses.
Slides are here.
Anders Wallgren is Chief Technical Officer of Electric Cloud. Anders brings with him over 15 years of in-depth experience designing and building commercial software. Prior to joining Electric Cloud, Anders held executive positions at Aceva, Archistra, and Impresse. Anders also held management positions at Macromedia (MACR), Common Ground Software and Verity (VRTY), where he played critical technical leadership roles in delivering award winning technologies such as Macromedia’s Director 7 and various Shockwave products.
Agile Tweaks - Finding next steps in your Agile Process Evolution
August 19, 2010
This presentation describes ways to examine your agile process to improve quality and productivity. Key areas discussed include tooling, metrics, collaboration, getting to DONE – DONE – DONE, and generally tightening your process.
Slides are here.
Jeff Smith is an ALM (Application Lifecycle Management) Consultant at Northwest Cadence. He is an advocate for software process improvement and has been a primary process advisor and developer for Dell Computer, BearingPoint, Boeing, LexisNexis, IBM, assorted startups, and other organizations. He has a Bachelor of Science in Electrical Engineering from Texas A&M and did post-graduate studies in Computer Science and Business Administration at the University of Texas - Arlington and University of Texas - Dallas. He has almost three decades of high-tech and software development experience with a range of organizations, industries, and roles. Jeff is a Six Sigma Greenbelt and a Certified ScrumMaster, and is also ITIL Foundations Certified, ASTQB Certified (CTFL), and Lean+ Certified. He is the current chair of SeaSPIN.org and has served on the boards of the Austin SPIN, Association for IT Professionals - Austin Chapter, and Agile Austin.
Testing the Mobile Application's Performance - a Case Study on Mobile Devices
September 16, 2010
Mobile software usage is growing quickly, and an expanding number
of consumers are expecting a consistent experience when moving from
the PC to mobile devices and back again. One of the key factors
enabling wide deployment and adoption of mobile applications is
careful usage of system resources. High-quality mobile applications
must consume minimal system resources such as CPU, memory, and, most
importantly, battery. It is crucial that design for mobile
applications considers frugal and optimal resource usage as an
essential aspect of application design (direct porting of desktop
applications to mobile platforms is most often a recipe for
failure). Knowing and understanding the factors that influence
performance on mobile devices is extremely beneficial in order to
plan a successful testing approach. Equally challenging is the task
of measuring application performance and reporting the performance
data in a useful and actionable manner.
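The measurement-and-reporting task described above can be illustrated with a minimal, hypothetical sketch. The `profile` helper below is purely illustrative (and uses desktop Python's standard library, since the mobile profilers discussed in the talk are platform-specific): it runs a function and reports CPU time and peak memory in an actionable form.

```python
import time
import tracemalloc

def profile(fn, *args, **kwargs):
    """Run fn and report CPU time and peak memory allocated: a crude
    stand-in for the platform-specific profilers a mobile tester would use."""
    tracemalloc.start()                       # begin tracking allocations
    cpu_start = time.process_time()           # CPU time, not wall clock
    result = fn(*args, **kwargs)
    cpu_used = time.process_time() - cpu_start
    _, peak = tracemalloc.get_traced_memory() # (current, peak) bytes
    tracemalloc.stop()
    return result, {"cpu_seconds": cpu_used, "peak_bytes": peak}

result, stats = profile(lambda n: sum(range(n)), 1_000_000)
print(stats)
```

On a device, battery drain would be measured with platform instrumentation rather than in-process; the point of the sketch is that resource data becomes actionable once it is captured per operation and reported alongside the result.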
Slides are here.
Rama Krishna has been working at Microsoft as a Software Development Engineer in Test for five years. He has been extensively involved in testing the Communicator Mobile application on various mobile devices including Windows Mobile 6.x series devices. Prior to Microsoft, he has worked as a Software Developer at Cisco Systems for four years. He earned a Master’s degree in Computer Science in 2001.
Why Agile Works: an Exploration
of Agile, Discipline, Uncertainty, and Risk
October 21, 2010
Teams that succeed with Agile methods reliably deliver releasable software at frequent intervals and at a sustainable pace. At the same time, they can readily adapt to the changing needs and requirements of the business. Unfortunately, not all teams are successful in their attempt to transition to Agile and, instead, end up with a “frAgile” process. The difference between an Agile and a frAgile process is usually in the degree to which the organization embraces the disciplined engineering practices that support agility. Teams that succeed are often the ones adopting specific practices: test-driven development, automated regression testing, Continuous Integration, and more. Why do these practices make such a big difference? Elisabeth Hendrickson details essential Agile engineering practices and explains how they mitigate common project risks related to uncertainty, ambiguity, assumptions, dependencies, and capacity.
Elisabeth Hendrickson has over 20 years of software industry experience and has been working with Agile teams since 2004. She is a Certified Scrum Master, a respected thought leader in Agile Testing, a former member of the board of the Agile Alliance, and co-organizer of the Agile Alliance Functional Testing Tools program.
A Gathering of Software Prophets
November 18, 2010
Ever wonder what a collective century of software-making experience looks like? Are there nagging questions that plague your quality and testing efforts that you want answered? A gathering of software quality professionals from diverse industries will be available to answer your questions at the final SASQAG event of the season. The SASQAG Board members and friends will participate in a panel discussion on software quality covering both Quality Assurance and Quality Control.
There are no slides available for this event.
Keith Stobie, Tom Gilchrist, Alan Page