
Data-Driven Design with James Aylward of Pluralsight


Pluralsight is looking to change the way the world learns, and they are using data and human-centered design to do it. That may sound like a big goal, because it is, but I wouldn’t count them out.

James Aylward is the Head of Data Product at Pluralsight. He leads multi-skilled teams that are part of the growing Data Product Hub at Pluralsight. He applies his extensive background in product and data science to developing Iris, the IQ assessment tools, and the overall product experience at Pluralsight. Driven by human-centered design and Pluralsight’s mission of democratizing technology skills, he’s always looking for ways to advance the user experience through data science.

Anytime I can talk to someone at Pluralsight I do it! From Nate Walkingshaw to Gil Lee to James and pretty much anyone on their team, I always walk away feeling inspired to up my product experience game, and this was no exception.

Listen to my interview with James as we talk about:

  • His journey from Australia to product leadership in the US
  • Directed Discovery and seeking context when making decisions
  • Using qualitative and quantitative data to ship daily
  • Data-driven and human-centered design


Transcript

Heath: Joining me on the show today is James Aylward. James is the Senior Vice President and Head of Data Product at Pluralsight. Welcome, James.

James: Yeah, thanks. Nice to be here.

Heath: So James, you’ve been at Pluralsight for four months, is that right?

James: Yeah.

Heath: But I’m going to do what I normally do, which is back up and talk a little bit about how you got here. Tell me a little about your journey in product and how you got started.

James: Well, it’s always a circuitous route into product, right? I mean, I guess I started life as a really, really bad Visual Basic/Java developer way back in Melbourne, Australia, and quickly realized I wasn’t very good at developing. I didn’t go to school for that; I went undergraduate for business and I got an MBA. But what I did get out of those couple of years is a real appreciation for what’s possible. That time was the early days of the internet, of what we could do and how we could innovate, and basically how development works. But I also started to see a huge gap, because back in those days it was waterfall; I was working for a big financial services firm.

You didn’t have to be a rocket scientist to work out, “Hey, the customer is missing here. Where is the end user in all this?” So I quickly moved out of the engineering and developer side. When I moved to the States, I worked a little bit in some data stuff with Harvard Medical School, then eventually E&Y and tech consulting in the E&Y world, and then really got the big break at Staples, where I started their innovation idea exchange, which was the innovation program for all the associates at Staples. Staples had, I think, like 90,000 associates at that point.

And bringing those ideas from the front lines to actual products was something else I was fascinated by, so that’s where I started to get on that journey and get that real energy for product, launching products, and being involved in the product creation process: understanding what customers want, prototyping, building, and launching. From there the big recession started. I thought it was a great time to go get my MBA, went back to Sydney to AGSM, which is the Australian Graduate School of Management, and then came back on exchange to Dartmouth to finish my MBA.

All throughout my time at Staples, I’d heard all this stuff about Vistaprint. And I was like, “Oh, man, they’re doing something right.” You could just tell by the metrics; they were beating us on copy and print in every category. So what was the magic there? So I went there, and Vistaprint was, I guess, my real product learning school. It ran very similar to The Lean Startup, the book. Everything was A/B testing, huge amounts of data in every product, and it was a really fun combination of physical and digital products coming together. And a whole bunch of algorithms involved as well: what do we display to the customer inbound, and what does the returning customer experience look like versus a new customer experience, which in those days was really groundbreaking.

Everything was an A/B test. And there was a ton of autonomy given to people at a very early stage in their career. So I learned a lot about data-driven processes and how to build products that we have high confidence in, in terms of revenue generation, and things like door testing and demand testing through data-driven experimentation at Vistaprint, which then led me to Gazelle. My boss from Vistaprint went to Gazelle, and she said, “Hey, you gotta check this out, it’s a really high-growth startup.” And it was. It really pioneered the resale market for the iPhone and MacBook, and Android for that matter. At the time it was a one-sided market, and after I came in we made it a two-sided market; we were both buying and selling used iPhones and Androids. And again, the math and algorithms behind getting those markets to match is huge.

So I gained a real appreciation at Gazelle for how we can drive both sides of a two-sided market to make a really robust business out of that. And then I guess the latest stage was at Fidelity Labs. I went over there and founded the artificial intelligence incubator at Fidelity, looking for places we could drive product development, but also make ourselves more efficient in terms of using AI and ML techniques throughout the firm. And it’s a huge place; there’s a whole bunch of different attributes of that we could dive into. But it was a lot of fun establishing the AI incubator and growing it from, I mean, a conference room to about 24 or 25 people working in cross-functional teams that look a lot like the cross-functional teams here at Pluralsight. The architecture of both the systems and the people organization at Pluralsight is really what attracted me here.

So that was a long journey to how I got to where I am today, but it felt like along the way I was learning different attributes of product development that have really set me up to be able to provide leadership here today.

Heath: Vistaprint, I hear that name a lot too. And I think VentureFizz sometimes does these, I don’t know if they call them retrospectives, but “Where are they now?” kind of things, and I always think it’d be interesting to have them do that with Vistaprint.

James: Oh, my God, the Vistaprint network is deep and rich, and it’s spawned a whole bunch of other companies; DraftKings is probably the biggest example of that. It’s interesting, that mindset, and how it creates companies. And Boston needs that; we need those sort of anchor companies that train people up, who then go and spread out and make a richer and deeper ecosystem. And that’s the philosophy I’m trying to bring to Pluralsight today in Boston.

Heath: So you’re new to Pluralsight, four months or so in. How have you gone about attacking the 30-, 60-, 90-day indoctrination into Pluralsight?

James: Yeah, look, I’m thrilled. I’ve never been happier joining a company, ever. This has just been an amazing experience. There’s so much that you expect to have to work out as a new leader coming into a position: usually you’re looking at vision and mission statements, and then trying to work out how we structure the teams, whether the teams have the right makeup, whether we’ve developed the right culture. I’ve been amazed by how well that’s been ironed out already. We all know what we’re here to do at Pluralsight; we’re here to democratize technology skills, everybody knows that, and it permeates the culture.

We have core values that … It’s funny, because I tell it as a joke, but I left Fidelity and was walking down the street after a bit of a party and send-off there. It happened to be the summer party here, so we were walking down the street to get on a boat for the Pluralsight summer party, and as a joke I rang my wife and said, “Hey, I’m between jobs right now.” But I got on the boat and people would just come up to me and talk about the core values of Pluralsight, unprompted. I was trying to meet people, but they were talking about, “Hey, seeking context is a big deal here.” We all have ideas and thoughts.

And people throw them out there. But instead of just judging them on face value, we then go and ask for more context and understand what the factors were that led up to that decision, before making a judgment or making plans beyond that. Another one is “accountable for excellence,” and “committed to something bigger.” So at Pluralsight we have a huge and thriving business, B2B generally, some B2C, around how to train people up from their current skill set to the skill set that they want to have. But we also have Pluralsight One developing as well, which is a nonprofit through which we want to democratize technology skills across the world. And we’re starting to do that by providing more access to computer science teachers throughout high school.

And also making our product available for people as they leave the high school level, so they can go up into professional tracks using Pluralsight and get skills that are in demand today. So this is a real, deep culture that is attached to the vision of the company. The architecture of the team is amazing. Each one of these teams is a small team that has product people, designers, architects, and data scientists as we need in the team, and they all have the autonomy to ship to production every day. They also have this amazing process … It’s based on lean, but it’s not quite lean; it’s called Directed Discovery, which is our process of using qualitative data.

We’re increasingly using more quantitative data in this approach, but our small teams are totally authorized to ship to production and start acting upon the insights that they’ve gained from customer discovery, so each one of these teams can run and do and learn and quickly get up to speed on where they need to be. So you take all that into account, and I’m like, “I don’t have to come in here and do the sort of 90-day playbook. I’m ready to start innovating.” I’m listening to the ideas the team has already got on track: how do I increase those, how do I accelerate, what roadblocks can we remove, if any, and how do we get to work. So it has been amazing for me coming in. And I’ve had one-on-ones with pretty much everybody in the data product organization.

And they’re all … It’s very similar. It’s just, “Hey, this is the best place I’ve ever worked.” And that sounds like total propaganda, but I’m not lying. It’s been so refreshing and so cool to come to a place where we all have that shared sense of mission.

Heath: I guess we can blame Nate or give Nate credit for all that.

James: Completely, yes. Absolutely.

Heath: So on the Directed Discovery front: clearly with that approach, it seems like you have the customer at the forefront, and you’re getting context around everything you’re doing. Are you tasked with adding in and upping the data aspect of that? Tell me how machine learning, AI, and data science are being infused into the process.

James: Yeah, there’s a couple of different ways. I mean, don’t get me wrong, Pluralsight’s been using data for years. But we’re trying to systemize it and prioritize the use of data within Directed Discovery. There’s lots of different ways, and we’re still in the draft phases of it. But the way I see it is, we’ve got voice of the customer, a huge element of how we do things at Pluralsight. How do we augment that with the voice of the customer as revealed by data? People will tell you stuff, but you couldn’t tell me what buttons you clicked on during the experience or what search terms you put in 30 days ago. Through a data deep dive we can see that, so we can bring out more of the voice of the customer in the moment, the revealed intention of the customer as they came to the Pluralsight experience and what they really looked for.

We can also identify whether the people we’re talking to, on the qualitative side, to your point, are representative of a broad customer segment within Pluralsight, or whether they’re just a niche that we happened to gather at that particular moment, and be more scientific about who we talk to in terms of qualitative discussions. That’s one; it’s sort of the discovery viewpoint. Another element here is, one of our core values is “create with possibility,” and data science gives us more colors in our palette. We can do more things with data science than we ever used to be able to with the more traditional software development processes.

So given that, how do we prototype and create with possibility to build experiences that we hadn’t thought about before? And that’s actually a really interesting mix, because most customers don’t have a true, deep understanding of machine learning and AI techniques. And even we are increasingly coming up the learning curve ourselves in terms of product and design. So how do we identify what’s possible in the product design world and work closely with data science, to see and be able to develop those predictive experiences that only we can develop? And that we need to develop quickly in order to build an experience that nobody else can touch.

So there’s the voice of the customer. There’s new prototyping. And then, when we’ve developed a product, how do we get deep with the analysis and tracking of that product? That’s more than just saying, hey, did the product work, or did it pass the test to see whether people gave a thumbs up or down. Did it achieve its objective? Have we harvested the right metrics? Are we looking at the right A/B testing, for example, to see if it’s working the way we imagined it was going to work to begin with? Bringing that data back in that feedback loop only makes the whole virtuous cycle go much quicker.
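
To make that feedback loop concrete: the post-launch check James describes often boils down to comparing conversion rates between a control and a variant. Here’s a minimal sketch in Python; the function name and the counts are invented for illustration, not Pluralsight’s actual tooling.

```python
# Hypothetical sketch: a two-proportion z-test for reading out an A/B test.
# Function name and numbers are illustrative only.
from statistics import NormalDist

def ab_test_readout(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# e.g. control: 120/2000 completed the module; variant: 165/2000
print(f"p-value = {ab_test_readout(120, 2000, 165, 2000):.4f}")
# A small p-value suggests the variant really moved the metric.
```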

Heath: How are your product teams leveraging this new frontier? I mean, it’s not that new; to your point, it’s been there. But I think what’s new is people actually making practical use of AI and machine learning. How are your teams starting to use that?

James: Yeah, so there’s a number of different ways. One is, after any learning module happens, we are recommending the next best action for you. It’s gone from a rule-based to more of a machine-learning-based approach, and we’re about to launch that more heavily in production. So that’s one angle. Another one, and it’s always been a strength of Pluralsight, is our assessments. Instead of having to do 30 or 40 or 50 questions to work out where you are in any one skill, we can cut that way back to around 10 questions. And that’s all because we can impute, from a history and a huge amount of data that we’ve had from previous learners, where you are on the proficiency curve in any one skill, quickly. So that’s one angle.
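
Pluralsight hasn’t published how its assessments work internally, but adaptive tests of this kind are commonly built on item response theory: keep a running estimate of the learner’s ability and always serve the question that is most informative at that estimate. A toy sketch under those assumptions; the item bank, learning rate, and “true ability” below are all invented.

```python
# Hypothetical sketch of an adaptive assessment loop (Rasch/1PL model).
import math, random

TRUE_ABILITY = 1.2  # hidden skill level the loop tries to recover

def p_correct(ability: float, difficulty: float) -> float:
    """Rasch model: probability of a correct answer."""
    return 1.0 / (1.0 + math.exp(-(ability - difficulty)))

def next_item(ability: float, bank: list) -> float:
    """Most informative item ~ difficulty closest to the current estimate."""
    return min(bank, key=lambda d: abs(d - ability))

def update(ability: float, difficulty: float, correct: bool, lr: float = 0.7) -> float:
    """One gradient step on the log-likelihood of the observed response."""
    return ability + lr * ((1.0 if correct else 0.0) - p_correct(ability, difficulty))

estimate, bank = 0.0, [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
for _ in range(10):  # ~10 questions instead of 40-50
    d = next_item(estimate, bank)
    answered_correctly = random.random() < p_correct(TRUE_ABILITY, d)
    estimate = update(estimate, d, answered_correctly)
print(f"estimated ability after 10 items: {estimate:.2f}")
```

With enough prior learner data to calibrate item difficulties, ten well-chosen questions can locate a learner on the proficiency curve about as well as a fixed fifty-question quiz.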

Another one is we’re starting to identify, “Hey, look, you looked at these elements in the content library, you were able to pass on this skill; we kind of think you’re in this role.” So we build that into Role IQ, and we’ve been able to predict, “Hey, you look like you’re going to be a big data engineer, or you’re on the Azure track, because you’re using a whole bunch of Microsoft Azure products.” And we’re able to walk people up through various Skill IQs to this Role IQ, and we’re able to say, “Hey, this person’s a data scientist, let’s get him or her up that curve, through these different skills and the next level of content, to where they need to be.” So what we’re trying to do is shorten that path from where you are today to where you need to be in terms of skills. It’s a little bit different from Netflix, where Netflix is trying to get you to consume more and more content.

We’re actually trying to get you to consume the optimal amount of content to get you to where you really need to be. Similar algorithms, similar math, but a slightly different tweak on the match between learner and content. So to me, it’s fascinating.
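
The transcript doesn’t reveal which model sits behind Role IQ, but the idea of inferring a likely role from consumption patterns can be sketched with something as simple as a nearest-centroid classifier over per-topic watch counts. Every role, topic, and profile below is made up.

```python
# Hypothetical sketch: guess a likely role from content-consumption vectors.
TOPICS = ["azure", "spark", "sql", "javascript", "design"]  # vector order

ROLE_PROFILES = {  # assumed average consumption mix per role
    "data engineer": [0.15, 0.40, 0.35, 0.05, 0.05],
    "azure admin":   [0.60, 0.05, 0.20, 0.10, 0.05],
    "front-end dev": [0.05, 0.00, 0.10, 0.60, 0.25],
}

def normalize(counts):
    total = sum(counts) or 1
    return [c / total for c in counts]

def likely_role(watch_counts):
    v = normalize(watch_counts)
    # nearest centroid by squared Euclidean distance
    return min(ROLE_PROFILES,
               key=lambda r: sum((a - b) ** 2 for a, b in zip(v, ROLE_PROFILES[r])))

# A learner who watched mostly Azure content with some SQL:
print(likely_role([12, 1, 4, 1, 0]))  # -> "azure admin"
```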

Heath: It sounds like in one way you’re trying to reach into the talent pool earlier to identify strengths and opportunities. Perhaps instead of saying, “Look, we’re not going to recommend something until they’ve achieved level 10,” you’re going to look at two achievement scores, for lack of a better phrase, and identify: look, they both scored a 90 on this proficiency, but this person is really more fit for this role and that person is really more fit for that role, and then level them up from there. Is that kind of the idea?

James: Yeah, I mean, that level of intelligence is what we’re going to be able to get to pretty quickly. That’s another reason why I’m so excited to join Pluralsight: we have these data assets that I don’t think anyone else in the marketplace and landscape can really touch, so we can actually be the authority on how you do that. The data assets we have are some of the most exciting things about Pluralsight; they’re rich and deep, and we’ve been very cognizant over the years of retaining a lot of that, so we’re able to help guide and optimize that learning curve. And it’s not just, “Hey, everybody else on this track does this.”

It’s, “People like you have previously done these steps to get to this outcome; you should too, or you should consider that too.” It’s basically where the masses are trying to get to, and in so doing we’re providing the sort of wisdom and guidance of all those who have gone before you. We’re not prescriptive about it; we provide that suggestion, and most of the time it seems to be pretty accurate. Now, it’s only as good as the data coming in. That’s really where, when we start talking about machine learning and AI, everybody wants to get into the algorithms; everybody talks about neural nets and LSTMs and a whole bunch of different algorithm abbreviations. But it’s only as good as that data. So when we start imagining and creating with possibility, we’ve got to understand the data set: what we have, what’s available, and how we can augment that data to provide the service that we need.

Heath: But is the biggest challenge posed not so much the collection of data as, “Okay, what are the algorithms that will come up with the most accurate and useful answer?”, rather than the collection piece?

James: Yeah, actually, having more data is, statistically, and we’re now seeing this from a whole bunch of research, more important than which algorithm you use. It will drive more efficiency and more proficiency in the end recommendation than hunting through all the different possible algorithms you could use will. You need to be in the right ballpark, but if I had a choice between more data or a better algorithm, I’m taking more data. It’s really interesting, because the more training examples you have, in general, and I’m talking in generalities here, the better your resulting model will be.
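
That claim is easy to demonstrate on synthetic data: grow the training set and watch accuracy climb faster than it does from swapping models. A rough sketch using scikit-learn; the dataset and numbers are illustrative, not a benchmark.

```python
# Hypothetical experiment: more data vs. a different model.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20000, n_features=30,
                           n_informative=10, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=5000,
                                                  random_state=0)

for n in (500, 2000, 8000, 15000):  # growing training sets
    for model in (LogisticRegression(max_iter=1000),
                  RandomForestClassifier(random_state=0)):
        model.fit(X_pool[:n], y_pool[:n])
        print(f"n={n:>6}  {type(model).__name__:<24} "
              f"acc={model.score(X_test, y_test):.3f}")
# Typically the jump from 500 to 15000 examples dwarfs the gap between models.
```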

Heath: Would you say your focus is more on the individual who is trying to assess and level up their skills? Or is it on the employer market trying to identify the best talent or is it equal?

James: So the answer is both, and the answer is actually yes. We’re also thinking about the content community as well: the people who write and author the content, and how we can provide them with the right data too. It all works in a really virtuous cycle. If we can make the learner’s journey so much more efficient and optimal, that improves the employer experience as well. They’re trying to upskill, and it’s a massive challenge for technology companies and for every company around the world, because most companies are becoming technology companies: how do you take the people within your organization who exist today and upskill them to the skills they need to tackle today’s environment, or the environment 10 years from now?

It’s a very difficult challenge. And again, you need that sort of personalization at scale. So if we can make the learning journey better, we can make the technology leader’s journey better within the organizations. And if we can feed that feedback loop, saying, “Hey, this content went really, really well,” or “We have a content gap in this particular technology,” to our content community, they can build the right content for helping everybody come up the learning path. Our data products have to run across all three of those personas, and we think about that every day.

Heath: What’s the biggest challenge for you here, your team, your organization?

James: Yeah. So it’s focus, and it’s my favorite challenge: we have a world of opportunity, so what do we do first? In what sequence? In what order? And how many resources do we assign to each one of these priorities? Which is the classic product and product design challenge. We see a whole bunch of different opportunities. We’re currently working on search technologies; our search is great today, but we want to make it so much better, because that’s kind of the transmission of the whole product experience at the moment. We want to be able to provide that incoming learner with the exact right content, and then from there, how do we recommend the next best action. So: search, recommendations.

We’re also always looking at our assessments, trying to make those as optimal as we can, and really doubling down on the core experience to make that learning journey so much more optimal and be respectful of the really limited time that people have to learn. And then, as I said, building more visibility for the technology leaders within companies, and also providing the right level of feedback and data to our author and content community.

Heath: Qualitative versus quantitative. That strikes me as similar to the question of data-driven versus human-centered design. How do you strike the balance between those?

James: Well, Directed Discovery is built entirely around human-centered design, and it’s always going to be; we’re not going to change that. We will always have a very deep respect for the qualitative understanding of discovery. And as I said before, the quantitative stuff is really augmenting that and bringing more of that learner’s journey to light, through the data exhaust that people leave as they go through the experience, and the sort of signals they provide through what searches they do and what buttons they click, in order to provide a better and more rounded picture of that human.

So we don’t see it as data-exclusive; we would not want to do that. It’s data-augmented or data-enhanced design. I don’t think anyone’s come up with a snappy algorithm … I mean, sorry, a snappy sort of acronym for it. But I think there’s a big opportunity in the market to talk about how we, as product practitioners, are able to build out this really deep and rich picture of the people we design for.

Heath: But it does seem like machine learning and AI are starting to be able to establish some of that human side, that intent, through the data. And I think that’s where a lot of the potential power is for these tools. We’re not turning our back on the qualitative, quite the opposite. We’re turning up the emphasis on the qualitative; we’re just allowing ourselves to access even more of it more readily, through either the existing data or through new data we’re collecting, because we have the ability to reach in and grab that information.

James: Completely. And there’s a bunch of different ways of doing that. It’s more than just understanding the person we’re talking to at that point. It’s also like, “Hey, this person gave us this insight; is that representative across all customers? How many people have the same challenge?” And then we can say, “Oh, great, let’s go fix that, because it’s affecting 93% of our customer base.” Or, actually, it’s only really a niche product problem for this one person, and maybe we can think of other ways to solve it, on a onesie-twosie basis, rather than building out a whole product for it. So it’s going to help us with conserving and using our resources in the way that best fits our customers.

James: Which is ultimately what we’re trying to do in any product decision: is it worth allocating resources to this problem, or to another opportunity?
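
One way to put numbers on “is this insight representative?” is to follow the interview with a quick survey or instrumented check and compute a confidence interval on the proportion affected. A minimal sketch; the counts are invented.

```python
# Hypothetical sketch: estimate how widespread a reported problem is.
import math

def wilson_interval(hits: int, n: int, z: float = 1.96):
    """95% Wilson score interval for a proportion; robust for small samples."""
    p = hits / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# Say 37 of 400 surveyed accounts hit the same friction point:
lo, hi = wilson_interval(37, 400)
print(f"estimated prevalence: {lo:.1%} to {hi:.1%}")
# If even the low end is large, it's a product problem, not a one-off.
```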

Heath: So what are some of the pitfalls to keep in mind when you’re using data? I mean, you’ve got this data science background, it’s clearly been a big part of what you’ve done. What are some things to look out for?

James: There’s a ton of things to look out for. As I said before, your algorithm is only going to be as good as the data that you’ve collected, so you’ve got to really think: where did this data come from, is it accurate, and does it have the right sort of volume and depth and velocity to be representative of whatever experience you’re trying to predict or gain insight from? That’s a big pitfall: if the data has any sort of bias in it, the algorithm will reflect that, and we’ve seen plenty of examples of that in the market where you’ll have a skew. So being able to understand any bias in the data is hugely important, something we take very seriously. But the upside is, we can use synthetic data to correct that bias; it’s actually a lot easier to use data to correct any bias than it is to un-bias big sales forces or teams or whoever it is.

And I’m not talking about necessarily malicious bias. I’m talking about the bias that people have from just being, you know, who they are, the way they look at the world, whatever it is. And of course, malicious bias, we should probably get rid of that too. But it’s being aware of that, to be able to understand what data you have in your data set. Is it skewed? And is your data complete? What are you missing? Because we’re not going to be able to capture all the data all the time from everything; do we have a way to identify where the gaps are, and then be able to make up for those somehow?
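
He doesn’t spell out the mechanics, but one common version of using data to correct bias is reweighting (or resampling) training examples so the training mix matches the population you actually serve. A toy sketch; the groups and proportions are invented.

```python
# Hypothetical sketch: reweight a skewed sample toward a known population mix.
from collections import Counter

population = {"group_a": 0.50, "group_b": 0.30, "group_c": 0.20}  # target mix

# Our collected data over-represents group_a:
samples = ["group_a"] * 700 + ["group_b"] * 200 + ["group_c"] * 100

counts, n = Counter(samples), len(samples)
weights = {g: population[g] / (counts[g] / n) for g in population}

for g, w in weights.items():
    print(f"{g}: observed {counts[g]/n:.0%}, target {population[g]:.0%}, "
          f"weight {w:.2f}")
# Feed these per-example weights into the model's loss (e.g. sample_weight)
# so under-represented groups count proportionally more during training.
```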

Heath: So, what’s most exciting in your space at Pluralsight for you?

James: Yeah, I mean, to me, it’s really that we have an opportunity here to change the way the world learns. I don’t think anybody else has the data assets that we have, at least in terms of technology skills. So to me, when you’re looking for a challenge, and I talk to data scientists all the time, and I talk to product people and people who want to get into data design: look, what do you want to do when you get up in the morning? Do you want to do something that’s truly impactful? And to me, the answer at Pluralsight is yes. If we crack the nut here, if we really understand how to be predictive and be able to identify the optimal path for anybody to learn …

We can change people’s lives doing that. You can get people to understand concepts and learn coding languages or design methodologies or even management styles that unlock potential for people. And we’re not just talking big businesses here, we’re talking about anybody around the world. We’ve seen that: Google made a video, it’s on YouTube, about the guy in India who was able to learn, I think it was JavaScript, I’m not sure on the language. But now he’s got an internship at Google; it totally changed his life and changed the life of his family. So, if we can take the data assets that we have here today, use the huge and rich and vast content libraries that we have at Pluralsight, and provide the next best opportunity for people to learn, that changes people. It changes the way people learn.

It’s exciting to get up in the morning and think about that. We also have people here from, like, the MIT Media Lab, who’ve worked there previously. I was talking with one yesterday on our team who really understands the way people learn and the teaching methodologies, and we marry that with our data sets. And it’s not just about using data to drive recommendations; we’re also trying to use the human understanding of it. And there’s a good example here. What we always had at Vistaprint was, we would make new content all the time, and the new content never sold as well as the old content. It was kind of like a band having new songs; you sort of have to see them every now and then.

Heath: Play the hits, man. Play the hits!

James: That’s exactly right. So if we just always showed what was most popular all the time, we’d soon become stale, because the landscape changes. So that’s one of the cool things about the challenges we have every day: how do we bring in that new stuff? And how do we weight that new stuff against the old stuff, to be able to provide the optimal pathway? Because the pathway will change. And that’s where we get into some really cool discussions about LSTM neural nets and other sorts of units that understand recency, so we have cool, cool conversations about that. But really, we’re talking about changing the way people learn so-

Heath: And what is it about the old stuff that people like? Is it just that it’s old, or simply the way it was packaged and delivered?

James: Yeah. Or is it just the volume? It was cool three years ago, but now things have moved on and our algorithm hasn’t. We need to be very aware of that and be able to make our algos change as trends change. So that training data is enormous for us too. And again, through the search boxes that we have within Pluralsight, and we’ve got a Technology Index out there already, we can start to understand how the market is moving in terms of technology skills acquisition, which is a super fascinating discussion to have with technology leaders around the world.
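
Neither speaker specifies a scoring function, but a simple stand-in for “weighting the new stuff against the old stuff” is exponential time decay on engagement, so older popularity fades as trends move. A toy sketch; the courses, view ages, and half-life are invented.

```python
# Hypothetical sketch: recency-weighted popularity for ranking content.
import math

HALF_LIFE_DAYS = 90.0  # tune: how fast old popularity fades

def decayed_score(view_ages_days):
    """Sum of views, each discounted by its age with a 90-day half-life."""
    lam = math.log(2) / HALF_LIFE_DAYS
    return sum(math.exp(-lam * age) for age in view_ages_days)

catalog = {  # each number is the age in days of one view event
    "classic-course": [400, 380, 300, 200, 150],  # many, but old views
    "new-course":     [5, 3, 2, 1, 0],            # few, but fresh views
}
for course in sorted(catalog, key=lambda c: decayed_score(catalog[c]),
                     reverse=True):
    print(f"{course}: score={decayed_score(catalog[course]):.2f}")
# Fresh engagement outranks the stale hit, so the pathway adapts with trends.
```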

Heath: As one who’s been in product for a long time, either on the design side or the product management side or, in my case, the marketing side, I’ve made plenty of mistakes, and one of those, when I was a product manager, I think was this lack of using and/or seeking data to drive product decisions. What’s a mistake that you’ve made in your career in product? Tell me a little about that.

James: Yeah. There’s two ends of that spectrum to my mistakes. One was … I mean, I think there’s a danger in going too quantitative. Vistaprint was a good example: we had the free business cards, and the quality of those business cards was … like, we could go to the printer here in the office and make a better business card. And this is no disrespect to anybody at Vistaprint, I love everybody there. But the paper quality back then was just awful for our free stuff.

And we would do all the hard work, we’d do all the marketing, we’d bring the people in, we’d give them the free business card, and then we’d be surprised when we studied the data and they didn’t come back. Like, why aren’t they coming back? The answer was pretty clear from a qualitative viewpoint: the paper stock was pretty bad. And this is actually where data set us free; we were able to build a business case there and say, “Look, if we just moved from our absolute base substrate to the next best one, that’s a million bucks a year or something, on a huge product line for business cards at Vistaprint. Let’s just try it. Let’s take that qualitative insight and forget our quantitative roots just for a little bit, just to do an experiment here.”

James: Lo and behold, it worked, right? So we were able to switch everybody up to the next best substrate, and that was more than paid for by repeat visits from first-time users. So that was the example of going too quant with no qualitative. Bringing in the qualitative and the voice of the customer, Vistaprint went through a huge learning curve.

Heath: Well, it’s the whole what versus why. We know what’s going on; we don’t know why.

James: Yeah, exactly.

Heath: So let’s ask.

James: And the reverse is true too. In other places we went so deep on the qualitative stuff; like, we talked to everybody, we did user journeys forever-

Heath: But you couldn’t see any patterns from that.

James: Yeah, and I think because … And this is always a problem for any product, I think, no matter what methodology you’re on: if you take too long to get to a prototype, you don’t get that feedback, that data feedback from real customers in real time, quickly enough. And what happens is, I think teams sort of clam up and don’t want to release the prototype. Or we end up talking about the potential UX or the potential data design forever, when really the customer didn’t want the product in the first place. So how do we discover that quicker? If you aren’t using a prototype to generate that quantitative data quickly enough, you’ll spend a whole bunch of time potentially building something that nobody wanted in the first place.

The rate of learning is never quicker than when your first prototype goes out, when you start to ask people to press buttons and try things. Well, the only time it gets quicker is when you go to alpha, when we’ve got people in the heat of the moment trying to use the product, and we really find out whether … Is it the UX, is it the data, is it the underlying whatever?

Heath: And it’s so hard to backtrack from that, the further down the road you get.

James: Yeah, but if you don’t do it, you can spend forever going down a path, and that’s the worst in my mind: when you build something that nobody wants. So when you talk about coming to Pluralsight, I met with Nate, and in my first interview with Nate, he sort of had me at the first sentence of our first conversation ever together, when he said, “Look, we’ve done X amount of customer interviews, and we’ve pushed to production Y amount of times,” and X and Y were both in the thousands. It showed that, A, we are listening to the customer, but B, we are quickly getting to production and understanding whether these insights and ideas work.

That’s hugely impactful. So I was massively happy to get into that sort of environment where we’re listening, but we’re also quickly testing. And we have multiple active teams, 30 or so, all doing that every day on every aspect of the Pluralsight experience. It’s really impressive. And when you see a team that’s really good at something, you sort of stand back and go, “Hey, this is awesome. How do I add to that?”

Heath: Get out of the way and how can I help?

James: Yeah, how can I bring in the right resources, and maybe tweak a few things, change a few things here. But ultimately, that flow from insight to product experience, how do we make that as informed and rapid and fluid as we possibly can, so that we’re not overthinking it, but we’re taking educated hypotheses, taking those to production, and really learning about them quickly? Then there’s also the psychological safety there of, “Hey, we took a shot and it didn’t quite work this time, but we’re gonna learn from that, tweak it, change it a few different ways, and have another go.”

And that’s what the teams do here all day, every day. The only thing I get worried about is if they’re not taking those shots. How do we get the teams going and taking as many at-bats, as many swings, and maybe every now and again swinging for the home runs, rather than trying to bunt or push a single here and there? How do we get them really thinking big and really creating with possibility? That’s what I see as my role as a leader: trying to inspire that sort of thinking, and being able to remove roadblocks when necessary.

Heath: What have I not asked that I should ask about data or Pluralsight?

James: I think there’s a growing need in the market for understanding what we’re doing here. I feel like we are learning a whole bunch of lessons about how to do product development. And I know Nate’s been great about sharing Directed Discovery with the world. We are evolving Directed Discovery; there’s more to come. Directed Discovery, like any process, is never done. I’ve been talking with a whole bunch of people in the data product world and in data science, and data science works on a slightly different time scale, and a definitely different likelihood of success. We might try to use data science to do something, and you find out rapidly that you can’t do it, or you find out you can do it and it’s amazing, or you find out different things than what you went in with.

So we wanted to find X from the data set. We couldn’t do X, but we could do A, B, and C, and you’re like, “Oh, can we turn A, B, and C into a product?” “Actually, maybe we can. Does anyone care about A, B, or C? Or how do we go back to the customer and then start the discovery loop again?” And then there’s also this thing with these product teams: as we build out a web-based experience, as a designer, you can sit there and put a button or a call to action on a web page, and we’re 99.9% certain that our dev teams can build that web page or that button that does the thing. With data science, you say, “Hey, I want to build a matching algorithm or a recommendation machine based on the data that we have.”

There might be a 75% chance that we can actually do that, or maybe a 60% chance, and how do we build that uncertainty into our development process? There’s also different timing as well; it might take a bit longer than developing a web page, for example. So how do we not slow down the teams that are doing that traditional software development process, but also take advantage of everything that we can get out of our data and data science? I don’t think anyone’s cracked the nut on that. So I think there’s a big procedural hole in the landscape there, and I think that’s where the next product development opportunity, in terms of product development process, is going to come from. I’m massively excited for that meta opportunity as well.

Heath: Well, between yourself and Nate and Gil and others like that, if it can happen anywhere, it can happen at Pluralsight, I’m sure, because you’ve got the right team.

James: I feel like we’re in the right place at the right time and it just feels natural for me. So I’m very excited about that.

Heath: Awesome. All right, well, thanks for joining us.

James: All right, no worries. It’s been great.

Heath: All right.

Author: Heath Umbach

Heath is an avid cyclist and runner who brings athletic rigor to everything he touches. With over 15 years of experience building and marketing digital products, he has a deep passion for solving our clients’ greatest challenges.
