Product Hero is our bi-weekly series to highlight outstanding members of the product management community. These industry leaders share tips on processes, team building, how to be a better product manager, and who they are outside of their careers. This week our product hero is Emily Rawitsch, Director of User Experience Design at Invaluable.
C. Todd: We’re here with our latest Product Hero, Emily Rawitsch. Emily, thanks for joining us today.
Emily: Thank you, I’m happy to chat.
C. Todd: Tell us a little bit about you and your background. How did you get where you are today?
Emily: How people land in user experience design is always a unique journey. For my story, I actually went to school for graphic design and started my career in small design boutiques focused on print design and brand identity. Over time I transitioned more and more into web design and eventually worked my way up to creative director, and along the way I became much more excited about brand strategy than layout execution.
I moved to Boston about six years ago and landed a creative director role on a product innovation team, which opened my eyes to the world of product design: I was creating user interfaces for web-based software and collaborating closely with product managers and engineers. I have to be honest that, up to that point, I had no idea what ‘product’ meant or what a product manager was. It was really the first time I had heard those terms, and it took me a while to wrap my head around it. The company was called Affinnova at the time (it has since been acquired by Nielsen), and while I was there I earned a user experience research certification known as the Certified Usability Analyst.
Today, I head up the Design team at Invaluable, which is the world’s leading online marketplace for buying fine art, antiques, and collectibles. I oversee both product and marketing design, yet my primary focus is on improving the user experience and conducting a lot of qualitative research through user interviews, usability testing, and even doing some field observations to watch people use our product where they live or work.
C. Todd: How did the skills you learned as a designer and developed as a UX practitioner translate into making great products?
Emily: I worked in Chicago for a few years for a small design studio and we did a lot of brand identities for restaurants and night clubs. Instead of collaborating with product managers and engineers, we were working with architects, interior designers, and chefs. It was about designing a full experience – what the branding looked like from the moment you walked into the restaurant, down to the design of the menu and restroom signage. It was really designing a holistic system.
I think experiential design and user experience design are very transferable skills — it’s thinking about how someone is going to interact with something whether it be a physical space, a print brochure, or an app on a mobile device.
C. Todd: What’s your top focus right now at Invaluable? What are the things that keep you up at night?
Emily: My main focus right now has actually been a multi-year project that we’ve been tackling little by little. We’re in the process of moving our site from an old code base to a new one. In doing so, it’s been a great opportunity to create a better e-commerce experience that is also friendlier on mobile devices. I know responsive design is very much a given for most websites today, but we are playing catch-up.
We recently launched an improved auction registration flow that included a new, mobile-friendly responsive form, and we’re already seeing completion rates increase by about 51 percent. That’s significant.
C. Todd: Can you talk a bit about how you do user research at Invaluable?
Emily: We do lots of user research in lots of different ways. Sometimes we’re testing a concept before it’s being developed. Other times we’re testing a concept while it’s being developed, and then sometimes we’re doing research after it’s been in the wild.
When it’s early concept testing, that’s when we’re moving fast. It’s kind of like design sprints, although in our company we call it design swats, like a SWAT team, a group of specialists coming in to solve an urgent problem. We don’t have a lot of time because the developers are itching to go. More often than not, we’re doing this in four or five days. To give you an example of what that hyper-speed research looks like, we just did a project two weeks ago, where on day one we were exploring lots of design solutions with divergent thinking. On day two, we regrouped with the product managers and engineers to converge on ideas and define what we wanted to learn in research.
On day three we’re finalizing our prototype, recruiting actual users, scheduling sessions, and putting together a moderation guide. On day four, we are talking to five to seven users for 30-minute sessions, and before we leave the office that day we’ve had our research debrief and have summarized our top findings in an email that is sent to the full team.
I will admit that we’re lucky we can move that fast to recruit and talk to our users. They’re passionate bidders, passionate collectors, and they love to talk.
C. Todd: Yeah, can you give us an example of something that got pushed to the back burner or even killed off because your initial assumptions about the product idea or feature fell apart when you put it out there?
Emily: So there’s one that really blew my mind. It was probably about two years ago, when we wanted to redesign the way search results were presented on our website; at the time it was a listview. We’re similar to e-commerce sites, but there are definite differences between e-commerce and bidding in an auction because of the need for a very detailed description and a condition report. These aren’t new items, so people want to know if there’s a chip, or they want to know the estimated price. When we were looking at other mental models and expectations of how search results present themselves, on most shopping or e-commerce websites it’s often a beautiful gridview of images.
We knew our users love images, so we wanted to test out a gridview. Our hypothesis was that they would want the gridview because they could see more items at once, and it was more of what they expected when browsing the web for shopping. This was prototyped before anything hit development and we did it using InVision. Because the initial results were so surprising, we talked to 12 people, and ultimately they were not wowed by the gridview.
A lot of people talked about the fact that it was harder to compare and contrast the information, that their eyes could no longer just skim down the page to compare the estimates or the conditions, that their eyes had to zigzag across the screen, and that made it feel like more work. Yet keep in mind, we talked to existing users who also had a bit of bias because maybe they don’t want the site to change. They’re used to how it is. The results could have been different if we focused on potential new users.
In the end, because nobody we talked to was jumping up and down for the feature, we knew that we could deprioritize it and then focus on something that might bring more value to our user base faster.
C. Todd: How are you doing user research and testing on parts of your product that are already in development or are already live?
Emily: Testing during development was a concept I personally struggled with in the beginning, because I remember thinking, why are we doing research if they’re already building it? If we find anything it’s going to be too late to fix it. Over time I understood the value of doing research while you’re building something. If we discover something that is catastrophic, we’re not going to launch it.
But more often than not, we only discover smaller things that we can then prioritize to be fast follows after the launch. In a way, we’re almost testing to validate, and then we know what to focus on as soon as it goes into the wild.
For example, we were doing research on what we call our live-bidding page, where there’s streaming video of an auction taking place and you can bid in real time. When we were doing that research, we realized that the way our design laid out on several types of monitors was giving people a false bottom. The participants didn’t realize that they could scroll down to see more information at the bottom of the page.
It wasn’t worth delaying the release because we could already bring more value to the user by launching the improved experience. This small “fold” issue wasn’t going to make or break it so we launched anyway, but then two weeks later we fixed it.
C. Todd: Cool, and the last thing you mentioned was testing in the wild, so it’s already out there, what do you do? How do you deal with this?
Emily: There are a few things that we do in the wild, and one is usability testing. Most of the research we do is moderated, so I am talking to a user and we are sharing screens as I watch them use our site. Our users are around the world, so that’s why we do it remotely. Every now and then we do unmoderated usability testing using something like UserTesting.com, yet it prevents us from asking probing or follow-up questions. Usability testing is not meant to be statistically significant, and it took me a while to explain that to my company. Talking to five to seven users of the same persona is enough to catch the big rocks. After you talk to about five people, the patterns start to emerge.
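The rule of thumb Emily describes, where five to seven users surface most of the big issues, lines up with the classic Nielsen/Landauer problem-discovery model: the expected share of usability problems found by n users is 1 - (1 - p)^n. A minimal sketch in Python (the 31 percent per-user detection rate is the commonly cited average from that research, not an Invaluable figure):

```python
def problems_found(n_users, p_detect=0.31):
    """Expected share of usability problems uncovered by n_users,
    assuming each user independently hits a given problem with
    probability p_detect (the ~31% average reported by
    Nielsen and Landauer)."""
    return 1 - (1 - p_detect) ** n_users

for n in (1, 3, 5, 7):
    print(f"{n} users: {problems_found(n):.0%} of problems found")
```

The curve flattens quickly, which is why the patterns start to emerge after roughly five participants and why additional sessions of the same persona yield diminishing returns.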
If we want to get statistically significant information in the wild, we can do some A/B testing. We also look at site analytics to track how things are performing. The exciting thing about research in the wild is that we can get much more accurate information. Our product managers really spearhead thinking about how we are going to measure the success of a project and defining those key performance indicators, or KPIs, ahead of time.
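For readers curious what "statistically significant" means in an A/B test like the ones mentioned above, a standard check is a two-proportion z-test on the completion counts of the two variants. This sketch uses invented numbers purely for illustration, not Invaluable data:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test: is variant B's completion
    rate different from variant A's? Returns (z, p_value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Invented example: 300/2000 completions on control vs 390/2000 on variant.
z, p = two_proportion_z(300, 2000, 390, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 here, so the lift is significant
```

As Emily notes, the KPI being tested and the required sample size would be decided before the test runs, not after the fact.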
C. Todd: How do you decide on KPIs to focus on? What have you found as being the most important to pay attention to in order to design a better product?
Emily: I will say that this is an area that we are definitely getting better at. When I started at Invaluable we really didn’t track a lot, and four years later we have made drastic improvements in the visibility and information we have. As for what to track, that really is more owned by our product team versus the design team, and we’re trying to do a better job sharing the results across the company.
We’re always keeping a close eye on what users are clicking, completion rates, or where in the process they are dropping off.
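Tracking where users drop off usually amounts to computing step-to-step conversion through a funnel. A small illustrative sketch, with hypothetical step names and counts rather than real Invaluable data:

```python
def funnel_dropoff(steps):
    """Given ordered (name, count) funnel steps, return per-step
    conversion rates so the biggest drop-off stands out."""
    rates = []
    prev = steps[0][1]
    for name, count in steps:
        rates.append((name, count / prev))  # conversion from the prior step
        prev = count
    return rates

# Hypothetical numbers for illustration only.
funnel = [("viewed auction", 10000),
          ("started registration", 4200),
          ("submitted form", 3100),
          ("placed first bid", 1900)]

for name, rate in funnel_dropoff(funnel):
    print(f"{name:22s} {rate:6.1%}")
```

In this made-up funnel the sharpest drop is between viewing and starting registration, which is exactly the kind of signal that would prioritize a project like the registration-flow redesign mentioned earlier.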
C. Todd: That’s really great. How do you know when your visual design is really good? How do you track that?
Emily: On the qualitative side, one thing I like to say is that good design is invisible. When we launch something and it’s not noticed, that is the biggest compliment. Hopefully we’re releasing changes so frequently that it just feels natural. I find that people tend to notice things more when it doesn’t work for them versus when it does.
For example, as we launch these responsive pages, it’s not like users are thinking, “Oh my goodness, it’s finally responsive!” To them, they probably didn’t even notice because they just expected it to be that way, so good design is invisible. We also talk to our users on a regular basis — our cadence for 2016 has been doing a user research project about once every two weeks. We are constantly getting feedback, asking the questions about what they love about Invaluable, or why they might prefer using a competitor.
C. Todd: How do your engineering, product, design, sales, and marketing teams all work together? Talk to me about the relationships between those different teams.
Emily: When I started at Invaluable in 2012, the process was a classic Waterfall assembly line. Product wrote pages of detailed functional specs, Design created wireframes and high-fidelity comps with detailed annotations, we tested static comps, and then everything was passed off to Development. When the coded design came back months later, it did not match the provided comps. This led to months of back-and-forth QA, where most communication was done in online project tracking software.
Today we have a really strong collaborative relationship between Product, Design, and Engineering. Low-fidelity prototypes are created in InVision to provide just enough information for early concept testing to inform product and design decisions. At the same time, a programmed prototype is developed quickly using a coded style guide. Research focuses on high-risk, high-effort areas, and developers are encouraged to observe sessions. Findings are integrated right into the coded prototype, and undefined areas are fleshed out by a designer and developer working side-by-side (some call this co-design; we coined the term “co-devign”). Rather than all QA hitting at the end, the designer and developer refine tweaks together along the way.
Just like you iterate design, you need to iterate the design process itself.
As for Sales and Marketing — it comes down to building good relationships with transparency and communication. Our product managers hold regular stakeholder meetings to keep everyone in the loop on progress against the roadmap. I also have weekly meetings with Marketing leads to make sure our priorities are aligned. You have to keep an open mind as priorities will change, and we simply have to adapt and be flexible.
C. Todd: Thinking just about the product design space, what’s missing from the conversation today? What’s not there that should be there?
Emily: I think one thing missing is honest conversations about how processes are really working within companies. I guess what I mean is that a lot of people will jump to ask, how does your team work? Are you guys Agile or Waterfall? I always joke that we’re Agile-ish, with emphasis on the ‘ish.’ I’ve even had people tell me that you can’t be Agile if you have fixed deadlines, but at the end of the day we’re a business and deadlines are part of that reality.
I think a lot of people want to attach labels, but instead we need more honest conversations about what we’re actually doing and if it’s effective or not. There is not a one-size-fits-all solution out there, but by talking about real challenges with real constraints, we can help each other iterate and improve our processes.
C. Todd: What advice would you give to someone trying to break into a product design role?
Emily: My best advice would be to get out there and network at design events. To me, the point of attending conferences and events is that you are there to meet people you don’t know. Yes, it can be intimidating to talk to a stranger, especially if you’re new to the field or worried that, “Oh my gosh I don’t know enough” or, “These people know more than me.” Everyone can learn something from anyone, so I think part of it is acknowledging that it can be uncomfortable but doing it anyway. Those one-on-one conversations are so valuable to learn how other teams work and to form relationships for future opportunities.
C. Todd: What exciting things do you think are going to happen in the next year or two for product?
Emily: One thing we’re starting to talk a bit more about at Invaluable, and I hear it come up a lot elsewhere, is machine learning. As we move toward the future of product design, it’s really going to be about creating custom, curated experiences for individuals. For example, let’s pretend that you know I’m an avid collector of antique sock monkeys, which is somewhat true. Why do you keep sending me emails recommending oil paintings? It would be great to personalize more experiences with content and timing. When I talk to users during research projects, they say things like, “You should know who I am, you should remember me.” They want and expect a personalized experience. They want to come to the site and be treated like an old, familiar friend.
C. Todd: Cool, very, very cool. Thank you so much Emily, this has been absolutely fabulous and really insightful. Appreciate you sharing your experience with the world, and thanks again for being a Product Hero.
Emily: Thank you for having me!