Live Blog: Selling (and Buying) "Live Site Quality" at eBay, Jon Bach, STARWEST 2013

By Beth Romanik

How many testers are so concerned with their product's quality that they seek out direct feedback from users?

At least one: Jon Bach, director of live-site quality at eBay. Not to get ahead of myself, but later in his presentation Jon shared some eBay user feedback he regularly wades through, and while some was quite descriptive about glitches observed, much of it was basically what you'd expect. One comment said "SITE IS FINE—PROVIDED IT WORKS" (yes, in all caps), and another simply said "please stop!!!!!!!!" (yes, with eight exclamation points). But if a comment could point to a bug, it gets investigated.

So, what is it that Jon does? He teaches eBay testers about bug investigation and how to avoid some common mistakes—in his words, he collects problems. 

He started off the keynote presentation by opening eBay.com in five browsers and doing quick confirmatory checks that everything was displaying and functioning the way it should. He said that especially in the world of DevOps, executives usually just want to know if everything is OK right now. They’re empowered by their tools to ship immediately because if something’s wrong, it's easy to fix quickly later on. He listens in on what Operations is doing so that if something is wrong and someone asks “How did QA miss this?” he can try to find the root cause of what happened.
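For the curious, here's a minimal sketch of what an automated version of that kind of quick confirmatory check might look like, assuming Python with the third-party requests library. The URLs and marker strings are placeholders chosen for illustration, not Jon's actual checks:

```python
# A sketch of an automated confirmatory check, assuming Python with the
# third-party `requests` library. The URLs and marker strings below are
# illustrative placeholders, not eBay's actual checks.
import requests

CHECKS = {
    "https://www.ebay.com/": "eBay",  # homepage should at least mention the brand
    "https://www.ebay.com/sch/i.html?_nkw=camera": "results",  # hypothetical search marker
}

def smoke_check(url: str, marker: str) -> bool:
    """Return True if the page loads with HTTP 200 and contains the marker text."""
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        return False
    return resp.status_code == 200 and marker in resp.text

if __name__ == "__main__":
    for url, marker in CHECKS.items():
        status = "OK" if smoke_check(url, marker) else "INVESTIGATE"
        print(f"{status}: {url}")
```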

Jon said he's trying to become "the" guy to go to in his company any time "something interesting" happens on the live site. To get ahead in testing, he recommends doing the same: become well known, get issues funneled to you, and know who to go to for specific tasks. He said he tries to keep an open discourse with other departments so that they can learn from each other, which is facilitated by the fact that he works right in the middle of the Operations, Customer Service, and Engineering departments. Often, just from chatting with him, coworkers will tell him, "Hey, I think you just gave me a solution, or at least something I can try." And that's a big component of attending the STARWEST conferences: forging connections and bouncing ideas off colleagues so you can mutually benefit.

One of the slides that seemed to particularly capture the audience's attention (and prompt a few smartphone photos) was titled "Is it good enough?" By "good enough," Jon doesn't mean a mediocre product. He means that the quality is such that any further work would be unnecessary, or even detrimental. Of course, below a certain point, quality is unacceptable and further time and effort are crucial. But above a certain level of quality, further time and effort would be a waste of resources. Jon said you should get to quality, then stop. "For example," he said, "I'm going to talk for seventy-five minutes, then stop."

How can you tell when you've reached that level? Sometimes, it isn't an easy call. But Jon shared about twenty heuristics eBay considers to determine quality. Included in the list were bad user experiences, survey results, and overdue bug fixes. They even regularly check the #ebaysux hashtag on Twitter to read customers' complaints.
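To make the idea concrete, here's a hypothetical sketch of how a few of those signals might be rolled up into a single "is it good enough?" view. Jon only named the kinds of inputs; the signal names, numbers, and thresholds below are invented for illustration:

```python
# A hypothetical roll-up of a few "is it good enough?" signals. Jon named
# the kinds of inputs (bad user experiences, survey results, overdue bug
# fixes, #ebaysux tweets); the names, numbers, and thresholds here are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class QualitySignal:
    name: str
    value: float      # current measurement
    threshold: float  # level past which the signal demands a closer look

SIGNALS = [
    QualitySignal("bad_user_experiences_per_day", 12, 10),
    QualitySignal("negative_survey_ratio", 0.08, 0.05),
    QualitySignal("overdue_bug_fixes", 3, 5),
    QualitySignal("ebaysux_tweets_per_day", 40, 25),
]

def needs_attention(signals):
    """Return the signals currently past their thresholds."""
    return [s for s in signals if s.value > s.threshold]

if __name__ == "__main__":
    for s in needs_attention(SIGNALS):
        print(f"Look closer: {s.name} = {s.value} (threshold {s.threshold})")
```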

If some of these methods seem unconventional, well, that's the point. In one of his several soundbite-worthy comments from the presentation, Jon said he's very interested in magicians and sleight of hand. "Everyone else is looking at the hand he wants you to look at. I want to be the one looking at the other hand," he said. "Let's look in uncommon places to figure out what's happening."

One of these uncommon places was on eBay Radio, a podcast mostly for the benefit of the site's sellers. Jon was invited to go on the show and respond to live call-ins about glitches users experienced. He was skeptical at first about what he was going to get, but he ended up receiving some interesting insight from customers. “I felt like a back-country sheriff whose next-door neighbor claims to have seen a UFO," Jon said. "‘Was it swamp gas? Was it an airplane?’ ‘No, it had purple lights swirling around!’ Oh. This really is something, then. So, what kind of UFO do we have?” So he calls reports that seem like they should be easily explained but end up revealing a legitimate bug "UFO reports."

The eBay Radio chat went so well, he went back the next month, and the next—on the fifth month, he was pleasantly surprised to discover that he got his own theme music and a title: The Bug Hunter! “And I wept. I was so honored,” he said, to much laughter from the audience.

Crowdsourcing opinions like that gives him context on issues customers are having that he may never have had occasion to check himself. He said he got to talk on the phone with a grandma type who was having trouble with the site. After some questioning, he found out she had three browsers running, with twelve tabs open in each. "Are we testing for that?" Jon asked. That’s how some people will be using eBay, so we'd better test for that.

At the same time, though, there has to be some kind of triage happening. Everyone says “If it ain’t broke, don’t fix it,” but what seems “broke” to you may not seem “broke” to me. “Yeah, I know you want to see all the bugs fixed. So do I," Jon said. "But that comes at a cost. How about we work on the top ten?” By asking for the top ten, you can gain some powerful context about the issues: For instance, they may not all be P1s—maybe there are some P4s the team absolutely wants fixed that week.
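Here's a tiny, hypothetical sketch of that "top ten" idea in code: rank bugs by what the team says it needs this week rather than by priority label alone. The bug data and field names are made up for illustration:

```python
# Ranking by the team's own "top ten" rather than by priority label alone.
# The bug data and field names are made up for illustration.
bugs = [
    {"id": "BUG-101", "priority": "P1", "team_rank": 4},
    {"id": "BUG-214", "priority": "P4", "team_rank": 1},  # a P4 the team wants fixed this week
    {"id": "BUG-307", "priority": "P2", "team_rank": 3},
    {"id": "BUG-412", "priority": "P1", "team_rank": 2},
]

top_ten = sorted(bugs, key=lambda b: b["team_rank"])[:10]
for b in top_ten:
    print(f'{b["team_rank"]}. {b["id"]} ({b["priority"]})')
```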

Here's a Jon Bach idea for free: a lunch activity featuring a test scenario with Legos. He assigned red Legos for login bugs, yellow Legos for search issues, and green Legos for categories. Give a table of ten employees a pile of Legos and have them go around the table twice, with each person adding the color of their choice to the creation each time. Then write a test step relevant to each color in the Lego creation, and you end up with twenty potential bug tests.

And you should be able to come up with the twenty test questions; you just have to think creatively. What time did the glitch occur? In what part of the country? In eBay's case, an important question is always Where was the bidding and the selling? Did this happen during the midnight callings? Did we just push code to production? Is there any other process happening? As always, context is everything. (A recurring saying during STARWEST 2013 has been "It depends.")
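If you wanted to bake those context questions into your intake process, it might look something like this hypothetical "UFO report" record. Every field name here is my own invention; Jon only listed the questions:

```python
# A hypothetical "UFO report" record that forces the context questions to be
# answered up front. Every field name here is invented; the talk only listed
# the questions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class UFOReport:
    summary: str
    occurred_at: datetime                       # What time did the glitch occur?
    region: Optional[str] = None                # In what part of the country?
    marketplace_activity: Optional[str] = None  # Where was the bidding and the selling?
    recent_deploy: bool = False                 # Did we just push code to production?
    other_processes: list = field(default_factory=list)  # Is any other process happening?

report = UFOReport(
    summary="Checkout kept asking for shipping details",
    occurred_at=datetime(2013, 10, 2, 23, 55),
    region="US-West",
    recent_deploy=True,
)
print(report)
```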

My favorite part of the presentation came when Jon addressed many testers' fears that they'll be forced to learn to program. “If I can impart some kind of hope for you," he said, it's that "maybe executives will realize programmers just want to program. Testing—the act and craft of imagining possible ways software can fail—leave that to someone else." I liked it because he then compared programmers and testers to two positions close to my heart: writers and editors. Though the two have to work together, there's an order to their jobs, and one person can't necessarily do the job of the other. “You write. You go write. I can edit when you’re ready,” Jon said. While a divide between writer/editor and developer/tester exists, he said the wall doesn’t have to be so high. Roles and responsibilities are important, as is knowing who you can go to for a certain task, but developers need testers (like writers need editors), so the two don't have to be so separate.

Jon concluded the presentation with a short burst of ideas he uses to monitor quality. First, check out the competition. It doesn't have to be in depth; he said that every once in a while he'll go to Amazon's and Etsy's sites to see what they look like and whether they're doing anything differently. Next, test site readiness. What if a large seller wants to have a flash sale? Can the site handle it? Find out. There's crowdsourced testing, of course. And dogfooding to check end-to-end flows. And there's the site's reputation, or, as Jon's brother James Bach calls it, reputation collapse potential. Do you see anything annoying that would drive a user to one of your competitors? Fix it or change it.
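As a thought experiment on the site-readiness point, here's a minimal sketch of a flash-sale-style burst probe using only Python's standard library. The target URL and burst size are placeholders, and a real readiness test would run against a staging environment with far more care:

```python
# A burst probe for the "can the site handle a flash sale?" question, using
# only the standard library. The URL and burst size are placeholders; a real
# readiness test would run against a staging environment with far more care.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://staging.example.com/"  # placeholder target, not a real eBay endpoint
BURST = 20  # number of simultaneous requests

def probe(_):
    """Hit the target once; return (succeeded, seconds taken)."""
    start = time.monotonic()
    try:
        with urlopen(URL, timeout=10) as resp:
            ok = resp.status == 200
    except Exception:
        ok = False
    return ok, time.monotonic() - start

with ThreadPoolExecutor(max_workers=BURST) as pool:
    results = list(pool.map(probe, range(BURST)))

failures = sum(1 for ok, _ in results if not ok)
slowest = max(latency for _, latency in results)
print(f"{failures}/{BURST} requests failed; slowest response took {slowest:.2f}s")
```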

And speaking of dogfooding and reputation collapse potential, Jon thinks he may have decoded the mysterious "please stop!!!!!!!!" feedback. He looked at the timestamp of the comment and what the user was doing and determined it was made while entering shipping information. When he explored that part of the process from a user's perspective, he discovered that eBay asked for your ZIP code three times. Could that have prompted the comment? Maybe. But either way, the process was streamlined. Now that's some dedicated quality assurance.