Teaching Thursdays Video Series: How EnjoyHQ’s Product Team Uses User Research & Creates Customer Archetypes
Welcome to the first episode of Teaching Thursdays, a new video series from Instratify where we collaborate with some of the top minds in user research to teach you step-by-step how to:
- Do, analyze, or store user research and insights; OR
- Use customer research to hit a growth, marketing, revenue, or product goal
Sofia showed us the exact process her team uses at EnjoyHQ to uncover game-changing product insights, prioritize features, and build dynamic customer archetypes.
Ashley: Welcome to Instratify Teaching Thursdays. I am super happy to have Sofia, the CEO and co-founder of one of my favorite user research tools, NomNom Insights, now EnjoyHQ. They just recently did a great rebrand. Thank you so much for being here today, Sofia.
Sofia: Thank you so much for the invitation. I’m so very excited to share a bunch of stuff with you today.
Ashley: Yes, super. I love the topic, and I can’t wait to see the insider process that one of the top customer research tools uses themselves. But, why don’t you tell us a little bit more about yourself?
Sofia: Sure. So, I’m the founder and CEO of EnjoyHQ, and what we do is help product teams, research teams, and design teams collaborate better when it comes to customer research. So, we aggregate customer feedback from different sources, and we help researchers bring all their user research data into one place, so they can do several things: segment their user base, organize their research, and tell stories about what they are learning from customers and share that within the organization. So, we’ve been doing this for a while now, over three years, and yeah, as we said before, we recently relaunched with a new name, but not only a new name, also a bunch of new features, things that we’ve learned from our own customers over the last few years.
Sofia: And today, I’m going to show you a bit of that research process: how we came up with ideas in the past, and how we come up with ideas today, about product opportunities, where to invest time, how to decide what to build next, and how to prioritize that information, always sourcing the inspiration from our customers, the feedback, and the research that we do.
Ashley: Which is probably one of the best examples of dogfooding that I could possibly have on Teaching Thursdays, so I’m pretty excited about that. So, you know, we’re gonna get started. Sofia’s going to walk us through how to identify and find game-changing product insights with customer research. We’re going to switch to a screenshare now, and she’s going to walk us through step-by-step exactly how they do this. Sofia, take it away.
Sofia: Excellent. So, let me just check here. Brilliant. So, this is not gonna be a very long talk; I just want to take you through some of the main points, and the little steps that we take to make sure that we are really listening to customers, and that we are using their input to identify opportunities to grow our product. And this is a very interesting approach, because we have been refining it over and over again. We have used several methodologies, several research methods, we have been looking into different frameworks, and what I’m gonna show you here is a combination of many things. Things that we tried, things that worked, things that we’re still testing … and we don’t know how they’re gonna unfold, but so far we’ve found it extremely helpful when it comes to making sure that whatever ideas we have are coming from our customers and their real needs.
Sofia: So, one of the things that we always think about is how we use the numbers, the tracking, the quantity of information that we’re gathering from customers, alongside the stories: what they actually are telling us, why they are requesting things, for example, why they’re behaving in a specific way. We look at, of course, usage behavior, and that normally comes from Amplitude, which is the tool that we’re using primarily to identify behavior and usage of different features. We look at, most of the time, what are the most used features, and we look at that information from the perspective of the personas or archetypes that we already have in our user base, based on our previous experience, or sometimes we look at that without segmentation.
Sofia: We try to bring in quantitative information from different tools, not only from product analytics like Amplitude. We also look at things like FullStory, another tool that we love, which helps us understand how people use the application, where they spend time, and where they struggle. So it’s just different ways of looking at the same data, and obviously we also use cohort analysis for that type of information. And that is not in isolation; we obviously look at what customers are saying, customer requests, common questions and complaints. In our case, that information comes from very different sources. We use Intercom for customer communication, we use forms for surveys, we use Delighted for NPS, we use Google Drive for our own customer reviews and transcriptions, and many other sources as well, email, and so on.
Sofia: So we have plenty of places to go when it comes to understanding what customers are asking, what their typical complaints are, and so on. And, of course, we also do proactive research studies, where we contact our customers and try to understand all of these different comments and requests in the context of their work, and whether or not the problem is just as we understood it, or there’s something else behind it. So we’re always going deeper with our customers. We build relationships with different ones, so we can really spend a lot of time understanding their workflow and how it evolves. So user interviews are just a key part of the research process, and FullStory helps us identify people that we want to talk to as well –
Ashley: I love that.
Sofia: Yeah. So all of this information comes together in our tool, actually; we use our own tool, of course, to be able to segment the data that we’re bringing in from different sources, to segment that data based on events and properties that are coming from Amplitude, for example. So, we can do things like, “Show me all the users that are using this specific feature, who also talked about this specific topic, on these three different channels” –
Sofia: And so on. We can really cross reference information, and try to see everything in context. All of that, anyway, what we do with this information is that it really feeds into our understanding of our own segmentation, who our customers are as a whole. So, we use mostly the concept of archetypes, which is a very psychological –
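The cross-referencing Sofia describes could be sketched in a few lines of Python. This is a minimal illustration, not EnjoyHQ’s actual implementation: the user IDs, event names, channels, and topics below are all invented, and real usage events would come from a source like Amplitude’s export rather than an in-memory list.

```python
# Hypothetical data: usage events (as they might arrive from a product
# analytics tool) and feedback items tagged with a channel and topic.
usage_events = [
    {"user": "u1", "event": "used_tagging"},
    {"user": "u2", "event": "used_tagging"},
    {"user": "u3", "event": "used_search"},
]

feedback = [
    {"user": "u1", "channel": "intercom", "topic": "tagging"},
    {"user": "u2", "channel": "nps", "topic": "pricing"},
    {"user": "u1", "channel": "email", "topic": "tagging"},
]

def segment(event, topic, channels):
    """Users who triggered `event` AND raised `topic` on any of `channels`."""
    used = {u["user"] for u in usage_events if u["event"] == event}
    talked = {f["user"] for f in feedback
              if f["topic"] == topic and f["channel"] in channels}
    # The intersection is the cross-referenced segment.
    return used & talked

print(segment("used_tagging", "tagging", {"intercom", "email", "nps"}))
```

Here only `u1` lands in the segment: `u2` used the feature but talked about a different topic. The point of the sketch is that the segment is an intersection of a behavioral set and a feedback set, which is what lets you see both “what they do” and “what they say” in context.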
Ashley: Can you explain that concept a little more for the audience? I think most have a good understanding, but it’s one of those terms that gets thrown around, so I think a definition would be great.
Sofia: Sure. So, in our case, archetypes are somewhat similar to the approach that people take with personas, it’s just that … and I’ll show you some examples of our archetypes, cause I think it’s just better when you can see them. So, an archetype is looking into behavior. It’s understanding what drives a specific type of user, what environment they are part of, how they evolved, and what their goals are. So, if you are a very big fan of something like Jobs To Be Done, and you love that framework, you probably are gonna love the idea of building archetypes. Cause it’s just the same; it helps you see people not through their job description, or their position, or whether or not they are at a level of influence or in a position of interest in the company.
Sofia: So, when we look at archetypes, we’re trying to find and understand the person. The person and their context, what motivates them, what is influencing their behavior, and what actually helps them achieve anything. So, a lot of Jobs To Be Done, but we evolved from Jobs To Be Done, as a framework, into archetypes, and I’ll show in a second why. They’re both super helpful, but we feel at this point that archetypes are probably gonna be better for our user base.
Ashley: Do you have a quick example of an archetype that you can just flip to for a second, so people can get a better grasp of what you mean?
Sofia: I actually think that’s the next one. Yes.
Sofia: So, I have many of those, and I’m just gonna show you one example here.
Ashley: It’s perfect.
Sofia: So, obviously you add a name, as you would do with a persona, so we have this archetype called the superstar evangelist, and this archetype in particular tends to be a VP of Product, or a product leader. It doesn’t need to be that title in particular; it could be somebody that is going to become a product leader, and has been working in the organization for a long time. This person is really good at getting people together and aligning them. It’s like a natural leader within the organization, and they deeply believe in getting closer to customers. So it’s a product leader that understands, and gets frustrated about, businesses that don’t pay attention to customers, that do not have processes around research, and looking at data, and really understanding the use cases.
Sofia: So, this is more a description of the personality of that individual. So you’re gonna see that person and say, “All right, this is Lisa, let me tell you about her. She does this, and she likes this, and she hates that, and she works best when these circumstances are present,” and so on. So you’re going to have a narrative behind that person.
Ashley: So two things I noticed, and I want to point them out. One is I don’t see any demographic data. Age, sex, location, how much revenue the company does, how many years of experience Lisa has in her role … so can you speak to that? Because often when we think of personas, we think of very demographic-type factors. This is much more, as you said, personality-driven, which I think is fantastic, and that’s what we do as well, but I’d love to hear your perspective on that.
Sofia: All the data around demographics, that is still extremely helpful, because it allows you to find patterns within your user base, for sure –
Sofia: So, what we try to do here is initially have a really deep perspective of who the person is, and really understand them. You can say, well, “Do you really know your customers?” Yes, I know them. “So describe them for me.” And that’s a human. And as a human, you don’t describe your friends, or your colleagues, with their demographics. You describe them by the things they like, the things they dislike, how they behave, and so on. So we try to first get to this stage. Then, of course, behind this archetype there is product usage. So Lisa, in this case, who is the superstar evangelist, probably uses my product in a very specific way. She probably uses a specific feature, she has a specific cadence of how often she comes to the product, a specific type of request that she’s asking.
Sofia: So when you look at your data pack, you can then group your users based on how many people look like Lisa, when it comes to product usage. How many people have this type of request, and are they very similar to this type of usage. So just are kind of cross referencing. So, we know about Lisa because we were looking into, for example, Amplitude’s specific features, we were looking into requests, and we were looking into part of demographics as well, like what are their titles, what they’re doing, what they’re asking, these things. And then you start painting a picture that evolves. This is not something that happens overnight, we’ve been doing for Jobs To Be Done, and many over things and personas, and we keep evolving. So, these archetypes is a result of a lot of research, and a lot of cross referencing.
Sofia: So, when I see these archetypes, I’m not seeing only the personality traits that I’m describing here; we also know, behind the scenes, what the numbers behind this are, what their usage is, what experience these types of customers have with us, what type of contract, all that information. But we don’t put it here, necessarily, because it distracts you from really going deeper into the behavior. So all that information is lovely, and we have it as well, but when it comes to explaining to engineers, this is a great example. When we talk to engineers about why we’re building the things that we’re building, they can relate to this. If I talk only about usage, they will think about solutions first, they will think about fixes, they will think about estimates, and all those different things.
Sofia: When it comes to really empathizing with a customer, this is way better, and you can see it in the conversations. They actually refer internally to the customers that we have; when we’re doing customer support and someone pops up on Intercom, they right away will say things like, “Oh, that’s a superstar,” or “that’s x, y, and z archetype.” They actually get really excited about all of it, and about the psychology.
Ashley: No, that’s fantastic. And you said a couple things there that I wanna make sure I highlight. One, you said you don’t describe your friends as 30 to 35, single, within a 100 to 150 thousand dollar bracket, has a cat, has a dog, lives in this area. You said describe them, your archetypes, like you would a friend. And I think that is such a great way, and a simple way, and a clear way to think about archetypes and personas, and things like that –
Ashley: So I love that. And I also love your explanation, for any marketers, product managers, and non-technical founders working with engineers: when you give them cold hard data, that usage data, it’s about fixes and solutions, versus first having the empathy to make sure you’re actually fixing the right problem, and I love that. Is Lisa an actual customer of yours? I have to ask.
Sofia: No, that’s a stock photo; she’s not an actual customer.
Ashley: Stock photo. Okay.
Sofia: I wouldn’t like to show an actual customer here without their permission, so I just used a stock one.
Ashley: Internally, do you guys use an actual picture of your customer?
Sofia: Absolutely. And it does help at times, to create empathy, which is something that I see most people miss when it comes to personas. They use other pictures, which are not the actual customers, and that doesn’t help, so we definitely use our own customers.
Ashley: That’s a great practice.
Sofia: One thing that I wanted to add here: the way we build the archetypes, they come from a lot of customer interviews. So it’s not only customer feedback, and product tracking, and so on … it’s a lot of customer interviews, face to face, going to their offices, and really building this understanding over and over again, over time. So we have built this understanding on an ongoing basis, not by saying, “Let’s do twenty customer interviews and define our personas,” which is a big project that you then forget about, assuming those personas will last forever –
Sofia: Yeah. So you keep doing this, and add to that understanding on an ongoing basis, until you see patterns that are so, so clear. So for us at the moment, when we start talking to a prospect, and we just get on a demo, for example, it is very, very easy for us to know, “This person is definitely this archetype.” You can start seeing all the patterns around it. So it’s just very powerful for sales. It’s a common language that you build inside a company, but it’s a constant thing. You can’t build up that pattern recognition just by doing a bunch of interviews once a year.
Ashley: Absolutely not, I think you brought up an excellent point. So I wanna ask you first, where do you keep your archetypes, so that all the different departments and functions can have access to them? That’s one question I get a lot.
Sofia: Yeah. So, actually, for archetypes we have multiple representations that we use internally of how this research process evolved, and we do have a keynote presentation that I give. And we have all of our customer interviews in EnjoyHQ, all of our customer feedback in EnjoyHQ. So whenever we are analyzing customer feedback, or analyzing a user interview, we will add comments and classify that information as, “Okay, this is the archetype, you know, superstar evangelist,” for example. I will show you in a second what that looks like, cause we can build, literally, a graph of how much feedback we received from each archetype.
Sofia: So, we try to match that together.
Ashley: Yeah, that’s fantastic. One thing you mentioned, and I thought this was brilliant, was that personas often are a one and done thing, they’re static, when realistically, markets are dynamic, and if your markets are dynamic, then of course your personas and archetypes are dynamic as well. You continually update those, and that’s fantastic, and that’s why you’re doing so well. So, why don’t you tell us a little bit about the cadence. How do you do it, do you do it once a quarter, you make sure you’re updating them, is there a regular cadence you have to make sure they’re always updated?
Sofia: No, this happens on a weekly or bi-weekly basis, because we do have customer interviews for research, in-depth research, or just follow-ups with customers. We try not to do this thing of interviewing a bunch of customers only when we want something, because it always feels like a project. It always feels just like a project. So what we try to do is keep talking to customers, and that may mean a long interview about a specific problem we want to understand, and sometimes it may mean a chat, and things happen out of that. So, we try to capture all these different conversations and think about them as customer research as a whole, and not necessarily as a one-off project. So, whenever we have demos with customers, or potential customers, that feeds into our understanding of personas, because they will eventually become customers, and if not, we can understand why, and the difference.
Sofia: So, it is a constant thing, so we bring in those updates. Now, when we really refine this specific sort of slide, if you like, that happens on a monthly basis. Sometimes it changes, sometimes it doesn’t. And that’s because we have a sort of monthly business catch-up with the entire company, and we go through our customers over and over again, so that’s an opportunity to change this slide, this representation, if you like. But it evolves on a weekly basis, for sure.
Ashley: So, just to sum that up, basically you have an all hands meeting monthly, and at that time you bring all the understanding together, and you update, if necessary, because just as important as changing and updating, is knowing that it’s actually still the same. So you guys do that on a monthly basis, and not only from, what you call proactive user interviews, but also from feedback you’re getting from demos, feature requests, customer support tickets, things like that as well.
Sofia: Absolutely. Yes. And again, what is interesting there, what makes it successful, at least for us, is that it is always embedded in our language. So it’s not something that we talk about in terms of research projects. It is something that we talk about on a daily basis. We’re talking to customers and say, this is archetype x, y, and z. And when we talk about features with engineers, we talk about it from the Jobs To Be Done perspective, and who is the type of person that will be using that, and why it’s important for them. So, I think just incorporating the language of these archetypes, at every opportunity that you have when you’re working with your team, will embed the understanding, and evolve the understanding with them.
Sofia: I don’t know if that makes sense but it –
Ashley: It does, it does. So, I think what I’m hearing, and please tell me if I’m mistaken, is that the culture is so customer-first. It’s not a research-first culture, it’s a customer-first culture. So what that means is, it’s just natural. You hire for this, it’s part of the culture, when you’re talking about doing anything. And I would like you to tell the audience, for those who aren’t familiar, what Jobs To Be Done is, just very quickly, if you don’t mind. Your culture is so embedded in Jobs To Be Done, or the persona, that that’s just naturally how you talk about it; it’s not like research is a one-off project. It’s getting close to the customers, talking to them consistently. It’s just something you do all the time, so you talk like that all the time?
Sofia: Yes, correct. So, I think there is so much written around Jobs To Be Done, but essentially, it’s trying to understand what people want to achieve with your product, but mostly, what are they hiring it to do. You know, what are they hiring you to do. So, if your product were a person, hired to do a job, what would they be hiring that product to do? This is what Jobs To Be Done as a framework is. So, people hire my product because they want to be able to understand their customers, and identify opportunities, so that they can build better experiences for those customers. And that’s a very broad kind of Jobs To Be Done description; you have to be way more specific than that, but it’s all about making progress. How do you help an individual make progress, and how do you understand why they use your product within the framework of, they’re hiring you to do a job? They could hire a person, they could hire a tool, they could hire anything else.
Sofia: But in this case, they’re choosing to hire your product. So, what is the function that you have? And that function sometimes goes way beyond what you think your product does, and what you think your features are for. So, it’s just very interesting. I don’t know if that’s doing justice to the entire framework.
Ashley: No, it’s a huge framework, and I think that was a really good, brief description. We’ll definitely be doing a Teaching Thursday on Jobs To Be Done. But, and I’ll include this in the show links as well, just as a kind of quick example for those not familiar, there’s a famous video, you can Google it, called the Clayton Christensen milkshake video. And basically, everyone knows McDonald’s, or most people know McDonald’s … so they were hired to figure out how to sell more milkshakes. And they took the approach of figuring out what job the milkshake does. Is the milkshake a snack, is the milkshake a breakfast item, is the milkshake something to fidget with? And what they actually discovered was that the milkshake’s job was none of those things.
Ashley: It wasn’t a drink or food, so much it was something to keep them busy while they were driving their morning commute. So that insight, of course, led to lots of things. But that’s an example, that’s the milkshake’s job at McDonalds, is to keep someone busy … with the thick slurpy feeling of milkshake on their commute. That’s exactly what Sofia is talking about there. One other thing that I wanted to quickly mention, if you wouldn’t mind going back to your slide beforehand, one previous. I also get a lot of questions about trying to understand the difference quantitative and qualitative data, I think everyone generally understands both, but do you have a quick definition of how you look at both, and how you define both for the audience?
Sofia: You know what? Over the years, what I’ve realized is that I like to see it all as data. When we try to divide them is when the whole conflict exists, and I understand people need to know the difference. But normally it’s just anything that you know, and that you have access to, about your customers. Sometimes it looks like numbers, sometimes it looks like words, sometimes it looks like videos and images, sometimes it looks like a graph. So, it’s just understanding anything that can give you a hint, and a clue, of who your customers are and how they behave. So, for the quantitative side of things, it is all about understanding, you know, sessions, usage, and cohorts, and whether or not they are onboarding themselves correctly.
Sofia: Is, so many metrics, my god. You’re tracking every single interaction in the app, and how utilizing all these micro conversions, if you like, that you have in your product. All of those are data points that are extremely important to understand, whether or not your product is successful. And on the other hand, you will have, what they say. So, not only what they do with your products, and your services, but what they actually express. And that’s filled with feelings. That is filled with a lot of anger and emotions, and sometimes happiness. And you also have to make sense of that information cause that’s a human side, the psychology side of your customers. So in our case, when we look at that data we are looking to customer feedback, complaints, and requests that come from different places, we’re looking into the narrative and the story that the put together when they’re talking to you, when you ask questions about their problems.
Sofia: It is all about how they interact with the app, and you’re observing how they’re actually working with things, and if they struggle or not. User testing sessions for example. So, for me, I try to think about customer research as just, an infinite variety of data points. The more variety, the better. It’s like your greens, right? When you say, you have to eat healthy and you should have a lot of colors in your plate if you want to eat healthy.
Ashley: Love that.
Sofia: It’s the same with data. So if you want to have a really nice understanding, a healthy understanding of your customers, you probably want to have numbers there, you want to have pictures, you wanna have conversations, you wanna have everything that you can, and try to see it all in context.
Ashley: No, I love that analogy, I think that’s fantastic. Would you say, generally speaking, that quantitative might be more numbers based, where qualitative might be more text based?
Sofia: Yeah, absolutely, yeah. By definition, yes, absolutely. They sometimes get a little bit blurred, and that is okay. But, as I said before, it’s all about understanding that this is just context, and you’re really trying to have as much as you can. So having more data points, together, will help you.
Ashley: No, I love that. And before we continue, and I really love that analogy, I’ll have to borrow that one and give you full credit: can you explain the difference between … we’ve thrown around a couple of different terms. One is customer research, and the other is customer feedback. Can you differentiate between these two terms, or are they the same thing?
Sofia: No, as I said, they’re very different. So customer feedback is anything that people are communicating to you on a practical level. They reach out to you because they’re angry, because something is broken in your app and they cannot get their job done; that is customer feedback. If somebody comes and says, “Oh, I would love it if your product had x, y, and z,” that is customer feedback, or an NPS survey. “I’m not happy with you because you didn’t talk to me when I opened this ticket, and now I’ve been waiting for three weeks,” that is customer feedback. Anything that people tell you is customer feedback. And I see that as just data, data points. Customer research is the process of making sense of those data points.
Sofia: So, being able to use and pick the right methodologies, the right frameworks that will help you make sense of that data, based on the type of data that you’re working with. So, you will have to code, when you look at interviews you might analyze interviews with different types of frameworks, and code an interview in different ways, that you would necessarily look at customer feedback. Customer feedback analysis needs more of a contextual, sort of approach, where you have to do some segmentation and understand, that this is feedback relevant in the context of our strategies, this feedback is relevant in the context of the segments of customers that we’re focused on. So you do be more in depth contextual analysis, I will say, within customer feedback. You want to see it in different ways.
Sofia: It’s just data points, it’s not necessarily the truth. It’s an indication that somebody has a specific request, or emotion, or need. There might be a problem, it’s just an indication that there might be a problem somewhere or an opportunity somewhere, and it’s your job to go into customer research, to really understand that opportunity. And I think, over the years. product managers, I think, have taken this to the extreme, right? They say, “Oh, well, if you ask your customers what to build,” and you’re gonna build the wrong thing, because it’s like faster horses, and just all this. And it’s just unnecessary, because nobody says that you have to build exactly what customers are asking, it’s all about understanding that as an important data point that leads you into a place where you can expand and explore.
Sofia: And, that is all it is. And, therefore, nobody should be scared of looking at customer feedback, or too weary of paying too much attention about what customers are saying. If you understand how customer research work, you will appreciate customer feedback even more, cause it will give you a little bit of a map, of where to start your research. And that’s all it is. So, that’s just a bit different. One is a process, the other one is just data points.
Ashley: No, I think that’s a great definition. So a user interview transcript is feedback, an NPS survey is feedback … live chat, an Intercom ticket, is feedback, an email somebody sends is feedback. But the research is how you make sense of all of that different feedback together, what lens you’re gonna use to look at that data, and how you make sense of it. Does that sum that up correctly?
Sofia: Absolutely. That is –
Ashley: I think that’s a great definition for our audience, and it does vary from company to company. Often, those two terms are confused, so that’s why I wanted to make sure our audience was clear on that, so thank you.
Sofia: Great. So, shall we continue?
Sofia: Excellent. So, I’m going to jump here to all the different archetypes, because we already talked about how we get there. But I wanted to show you a little bit about segmentation. So when we look at, and this is going back to the numbers, once you have a very strong definition, that you feel very confident about, of the archetypes within your user base, then you use the numbers to try and figure out how many of them you have, and what the distribution is. So, once you define that Lisa does these things normally, and uses the app for these reasons, and therefore these features are relevant to her … you can go back and see your user base in that context and say, “Okay, great, we have this many superstars.” At least, it’s an assumption.
Sofia: I’m assuming that, based on their behavior and what we have researched, this is actually the number of people that we have within these archetypes. And you can start validating that by pursuing more customer interviews, and trying to follow up and track that behavior. But why is this important to know? It is important to understand how all these different archetypes work together. In our case in particular, and it will be very different for other products out there, we are a collaboration tool. We’re a team product. Therefore our product is being used by multiple people in the organization. So for us it’s very important to understand how all these different archetypes work together, collaborate together, and what the relationships between them are. And also, if we had more of a specific archetype, what would that mean for the business?
Sofia: Is one archetype, for example the investigator, more interesting financially, because they use the product in a way that generates viral loops? Or do they use the product in a way that is more strategic to where we want to go? So, you look at your archetypes, and then you try to translate those into real people, how many of them you have, and what the implications of that combination of archetypes are, and then it becomes a lot more interesting when you think about strategy, right?
Ashley: Absolutely. And I think an important point you made there is also looking at the actual metrics: revenue generated, lifetime value of the different segments, and what mix ultimately leads to the most revenue growth, long term.
Sofia: Absolutely. For example, churn. As a SaaS company, you want to know: okay, these archetypes are great, and on paper they all look like great users, interesting customers. But maybe one archetype churns very easily, or maybe one archetype really needs to have another archetype on their team in order to stay active in the app.
Ashley: That’s so critical.
Sofia: There are a lot of insights that come from this exercise. And again, it has to happen on an ongoing basis. You cannot get to one insight here and assume that it’s gonna stay the same, or assume that it was correct, because maybe you did this analysis right after you released a bunch of features, and that influenced the behavior of your customers. Or maybe there was something happening in the market, and the analysis that you did was very contextual to that specific time in the market. So, generally speaking, you want to do this often. But it’s super cool to do it when you have the numbers, the statistics, everything in context. I love doing this and understanding the distribution of archetypes.
Sofia: And why? Why is this happening, right? Why do we have more people that behave this way, and other people that behave that way, and what does that mean? So –
Ashley: You know, I think that’s such a great point. Often people make personas, or archetypes, and they use them to write their messaging and marketing, and to look at their own product, but they’re not taking this bird’s-eye view of the analysis and asking, “This persona looks great, but is it churning?” Or, “This persona is great, it has the highest lifetime value,” without realizing that you also need the researcher along with the superstar, or that the superstar, even though it has the highest lifetime value, is going to churn out too fast. That’s such a critical point about that bird’s-eye-level analysis that sometimes gets missed. I love that.
Sofia: Sometimes people might feel uneasy about this, and I know this from previous companies I used to work for: you cannot necessarily assume that this is 100 percent correct. But you have to have some sort of assumptions to validate. So, when I look at this, I know that I have a distribution of archetypes. Am I 100 percent sure that this is exactly the distribution that we have in our customer base? Not necessarily, but I’m absolutely confident that, to get to this graph, we understood enough about our customers to feel okay with it 80 percent of the time.
Ashley: I think that’s also a fantastic point you’re making: you’re not certain this is right, there’s a degree of uncertainty, but it’s a lot more evidence than you would have if you didn’t do the work to get here. And you’re okay with that … 70, 80 percent’s great?
Sofia: Absolutely, yeah. And it’s about feeling comfortable with that. I know a lot of managers and researchers get pushback on the research that they do, because people might say, “Okay, well, that’s fine, but that doesn’t translate to the 10,000 customers that we have. How do you know that the five people, or 20 people, you interviewed translate to that?” And this is why looking at the numbers works, and why giving more context to those archetypes works. Because you can then look at some data points and say, okay, this is a behavior we can understand as a pattern, and therefore assume that this chunk of people exists within our customer base. But this is the type of analysis you have to do, going very deep with all the data points that you have in hand, in order to be able to assume these things. There’s no way to be perfect, but it will take you further.
Ashley: No, I love that. Thank you so much for sharing that with the audience. I do want to ask you one thing quickly that we didn’t get a chance to cover in the beginning, and I should’ve asked you before. NomNom, or EnjoyHQ, it’s going to take me a little bit of time to get used to the change, my apologies. What kind of roles did you have? Are you a researcher by training, or is your background in product? What did you do before founding Enjoy?
Sofia: Yeah, so, before I started with Enjoy, I was the Head of Growth at a company called Geckoboard, which is a data analytics dashboard. I was there, I think, three to four years. And actually, that was the birth of the idea for starting the business, because I wasn’t part of the product team directly, but I was working with them. I wasn’t part of the customer support team, necessarily, but I was working with them. I wasn’t part of many teams, but the whole idea of generating growth is a cross-functional sort of experience. So my role there was to understand how all these pieces come together: how the product team can understand customers better to build better products, how the growth team can understand customers better to build better campaigns and positioning … how customer support can understand customers better to build better experiences, and so on, and reducing churn in general.
Sofia: So, my previous roles were mostly around growth, and at the previous company one of the biggest challenges that we faced was alignment. How do you bring everybody together into a cohesive understanding of customers? How do you bring everybody together into an ongoing understanding of those customers? And how do you use the same data that we are all gathering for the benefit of multiple teams? Because everybody will do their own user research: the marketing team did their research and personas, then product will have a jobs-to-be-done framework, and another team will have a different approach to the same data. But the sharing and the understanding were very siloed. So, that’s my background.
Sofia: Before, I worked for a number of companies in so many levels. I worked for very large companies, consumer companies, I worked for large agencies, I worked for government. I’ve been jumping into industries and it always been related to either customer experience and growth.
Ashley: Yeah, I think it’s a very useful lens for everyone to have; it gives you such a breadth of experience. I have to ask you: when we think of research, we think about UX research, or Research Ops, certainly for product teams … but in your role as Head of Growth, would you have used a tool like Enjoy? Would you have brought that into the organization if it wasn’t already there?
Sofia: Yeah, absolutely, because at the time it was all very manual; we had to do this with spreadsheets, searching each of the tools to find things. I remember very vividly, on many occasions, we would run a campaign as part of a growth experiment. And we would get feedback, but because we on the growth team didn’t necessarily have access to Intercom at the time, and we used Intercom as well, the product team had access to it and didn’t know that we were getting feedback from that campaign, that we were getting actual customers talking to us about the effect we were having on this side of the business. And it was just by chance, by me asking and logging in, that I realized: oh my god, we have a bunch of responses here that we could’ve done something with. We could have talked to those customers, we could have fed something into the initiative that we were building. It’s just silly things like that that make you realize, “Wow, it’s these details that matter.”
Sofia: When the communication breaks, and the lack of awareness around what is available in the organization is too strong, the whole experience for the customer gets so fragmented that it’s just no way to improve it, because everybody’s calling from a different direction. So, that was part of my experience and the reason why I wanted to do this. Because, if it’s too manual for you to do customer research, you won’t do it. Because it’s difficult already, even if you have all the tools that you need. It’s super hard to do it cause it takes time, you have to analyze it, sometimes you have to learn a bunch of things that you don’t know. It is hard, if you have the resources. When you don’t have the resources, or the tools, you just won’t do it because it’s too hard. So you have to help yourself, to do the job. You have to help yourself to have the right people on board, the right tools, so you can allow for an environment that incentivizes people to do research. That’d be my experience.
Ashley: No, absolutely, and that’s my experience as well. I certainly see it as an organizational growth tool, and certainly within a growth function as well. That’s why I wanted to ask you about that, because I think that’s kind of an underutilized use case, certainly in my experience.
Sofia: Well, when product teams exist, they exist to build amazing product experiences that help the business grow. So you’re always gonna be tied to growth. If you’re a user researcher, you are discovering insights because you want to drive growth in the business; nobody does research because they want to do research, and that’s it. Everybody wants to make an impact. So if you’re on a product team, a research team, a design team, you are part of the growth team anyway. So –
Ashley: Love that.
Sofia: So, you have to align everything that you do towards how that helps the business grow and develop, and find new opportunities. So, I come to this specific problem with that kind of mindset, because I don’t have a product management role. I became a product manager because I had to build this business, so I had to become my own.
Ashley: Perfect. We lost you there for a second, but I think that’s 100 percent right: growth is across the whole organization. I see Enjoy, and this is how we’ve started looking at deploying it, as not only a research function or a research repository, which it most certainly is, but as a growth tool for the whole organization, to democratize user insight and that shared understanding. I love that, thank you for sharing that.
Sofia: Okay. I just want to show you a couple of things, because I know we’re going over time a little bit, but I wanted to show you another thing that we do, which is segmentation around usage. Here, I was talking about the distribution of archetypes across your customer base, but when it comes to product usage on its own, we also look at the distribution of people actually using the application. So here is an interesting graph: the 20 percent of your most powerful archetypes actually drive 80 percent of the usage. We could go very deep here, because this is a whole bunch of analysis, but I wanted to show you that you can also get to the archetypes from a segmentation point of view when you look at just product usage, and what the actual distribution is there as well.
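[Editor's note: the 80/20 usage concentration Sofia describes can be checked directly against event data. A minimal sketch, with made-up per-user event counts:]

```python
# Illustrative sketch: what share of total usage do the top 20% of users drive?
# The event counts below are invented for demonstration.
def top_share(event_counts, fraction=0.2):
    """Share of total events driven by the top `fraction` of users."""
    ranked = sorted(event_counts, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

usage = [500, 300, 40, 30, 25, 20, 15, 10, 5, 5]  # events per user
print(f"Top 20% of users drive {top_share(usage):.0%} of usage")
# → Top 20% of users drive 84% of usage
```

Grouping the same counts by archetype instead of by individual user gives the per-archetype version of this graph.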
Ashley: Perfect, now I love that.
Sofia: So if you had more of a specific archetype, would that drive more adoption by other types of archetypes, for example?
Ashley: So if you were to read this one, the superstar evangelist, drives the highest network effect?
Sofia: Exactly. So if you have more of those, they will bring in more people of other archetypes, viral effects and so on. There’s a lot of interesting analysis once you have the foundations there: once you have all the data and the archetypes, you can build different ways of looking at the analysis, and at least get inspiration from some interesting research questions, or opportunities for building product. I’m gonna go through this super quickly, because I wanted to show you that we also do this exercise, and again, this is very much driven by understanding how these archetypes behave in real life. We think about these things as “scenes”, and this is the part of our experience design where you’re thinking: well, you have an archetype; when is the archetype most likely to use your product, what are they gonna be doing, and why are they doing that thing? And just understanding those scenes.
Sofia: And the scenes have all these variations in what you’re experiencing: I am doing something and I’m feeling happy, and then something else happens and I’m not feeling that happy, and then I’m okay, and I just carry on doing my activity, but then something happens and, okay, super excited again … you know, we have this every single day. Every single hour, we are experiencing different feelings and emotions depending on what we’re doing. So your customers will be going through an experience with your product, with a feature, or with a workflow, over the entire project that they’re doing. So you wanna understand the ups and downs of that experience. For a specific archetype, you want to see what opportunities we have to make that person really happy or successful, and where we’re doing a really bad job. When do we think that we’re doing a bad job throughout the experience?
Sofia: And that experience is a scene, right? So, this is a bit of how we think about it. When it comes to product opportunities, FullStory’s great for this. Because with FullStory, if you think you have a problem in a specific workflow or feature, you look at where people are struggling, you look at the data … okay, this archetype is struggling with these specific workflows, let’s put an assumption in: if we change X, Y, and Z, we should see this engagement happen. We should see that there will not be any friction; we should see more people using this other feature. So you create your hypotheses based on where you think the experience is not working. But before you pick those experiences and create those hypotheses, you’re looking at them through the lens of the archetype.
Sofia: So I go to FullStory, and I’m looking at my superstars, and I think, okay “My superstars are struggling on this workflow.” What is the assumption? What are they doing? It’s not only clicking stuff … what they were doing? Were they in a meeting, and in that meeting they were talking to the boss, and the boss was frustrated, and they tried to find information here and they couldn’t? So just trying to understand the context behind it, and you can talk to those customers and understand that, right. That’s why you talk to them. But it’s about identifying parts of the experience that could be better.
Ashley: Okay. So this would be, for instance, the experience that one particular archetype would have using Enjoy?
Sofia: Yes. So let’s say that this is a sprint planning session.
Sofia: From the beginning to the end, and that might be two hours, maybe three hours, of planning with several product managers, engineers, designers, and so on. They’re looking at the feedback that came in over the last two weeks from the last release, and they’re super happy looking at the graph: “Oh man, we got a lot of feedback, look at these customer quotes”, it seems like it’s working. “I also see this data here that shows the usage … the engagement increased.” But then they realize that they cannot go deeper with just a graph. They want to discover more, but they can’t drill any deeper. So everybody’s disappointed in the meeting, and they’re like, “Ah man, it sucks that we cannot look at what is behind this graph.” So that is my opportunity as a product person, if I say, “Okay, there’s something going on there. What if we do something different to avoid that experience?” You want to keep people excited and productive.
Sofia: So, therefore we had an opportunity there to improve how you drill down through reports. That’s just an example.
Ashley: No that makes a lot of sense. Just quickly, so the audience can understand, what does M1, M2, M3, M4 stand for?
Ashley: Moments? Perfect.
Sofia: Yeah. So there are critical moments in your experience, right? You’re not necessarily gonna be the driver of the whole experience; as a product you are part of many other experiences, right, many other products and workflows, and so on. So you have to identify the moments that matter, and the moments that you really have to address, because you can’t address everything.
Ashley: And you guys rely on FullStory to watch users’ experiences in your product to then draw this type of graph and map?
Sofia: There are a lot of inputs. So we get feedback: people struggling, a request that keeps coming up repeatedly. We look at that immediately in FullStory. We see the pattern: is this happening often, how bad is it … then we talk to customers. “What were you doing? Can we jump on a call?” Then we understand the context behind it, and we compare that to the understanding of the archetypes that we already have, so it’s an iterative process. And this might become a feature, or this might become a fix, or this might become something that we don’t wanna do, because it’s outside the scope of the experience that we want to provide, and that’s just part of the process of doing the research behind what we should do as a product.
Sofia: So, there are different decisions that are made here, but definitely FullStory helps a ton, to contextualize that. Then, talking to customers, then of course listening to what people are saying and complaining about, of course.
Ashley: Perfect. Thank you for explaining that.
Sofia: Brilliant. So I’m gonna just take you through these very quickly. As we go through understanding the archetypes, the usage, the distribution across the user base, your customer base, and putting all these data points together, ultimately you have to make something practical, right? This has to turn into some work that people are gonna do, that you’re gonna test, a hypothesis, something, right? So, after all the research, you’ve understood the opportunities in your different scenes and experiences, you’ve understood what drives the main archetypes, and so on. You start forming a picture of what would be interesting things to do to drive whatever metric is important to you: more engagement, less friction for conversion, more revenue from sales. Whatever metric you’re focusing on as your North Star Metric for the product and the business –
Ashley: And if it’s us?
Sofia: You can then think about all the research that you did, and the opportunities that marry those together. Right? Because you are not a psychologist; the point of understanding your customers is not understanding for its own sake, it’s turning that understanding into something tangible for the business, so –
Ashley: It’s not an academic study, yes.
Sofia: Yeah. You’re not doing a PhD on your customer; you’re trying to understand how that customer will drive more revenue at the end of the day. So, here I’m gonna show you this, which is a bet. After all the research, what we do is take the understanding and start designing bets. A bet is basically a hypothesis that might have a positive effect, and that might also have a negative effect, and the more you try to map those out, the better. For example, this is an interesting bet, and it’s actually a feature. We discovered that people had challenges using search in our product, and that was hurting engagement within their onboarding process. We noticed that from people complaining, from looking at FullStory, from us looking at search as a feature in Amplitude and why people were using it or not, and from talking to customers.
Sofia: So we did all of that, and then we noticed, “Okay well, we do have an issue here.” And we noticed that was affecting particularly onboarding, cause it’s the first feature that people wanted to use, and so that experienced needed to improve. So, we understood that people were not prepared to create complex queries in the beginning, because they were very new to the product, and then that they wanted to have more flexibility in terms of the search results, how search results were presented. So, we said well if we build a better experience that has x y and z, so you have to define what that bet experience looks like, or might be, we might have this positive effect. So, still have the name NomNom here but users will use this more often, I’ll get to feedback faster, or users will execute, actually less, searches because they will get the results faster as well.
Sofia: So there will be different positive effects of this, that we assume, based on what we know that will happen. But, there might be also negative stuff. So the worse that can happen is that they don’t find it helpful and it just becomes even more confusing. So…
Sofia: You want to assume also what can go wrong, because most likely it will go wrong. That’s the beautiful thing about product management, that most of the things you assume tend to be wrong, and so if you only create your hypothesis thinking about the positive outcome, but the negative you ground your understanding, you ground through your experience because you start thinking about, “How am I gonna measure if it goes wrong?” Cause we normally think about, “Let’s measure to see if it validates it,” and everything was fine. But how can you build more measurements around going really badly, and understanding how that is affecting the rest of the experience. So, we build that, we create prototypes of what that solution might be, after talking to a lot of customers, and then you get more feedback on that and redefine it, and then finally you launch your first version of that solution that’s supposed to impact a specific part of the journey, a specific type of archetype, a specific type of outcome, which is more engagement within onboarding…
Sofia: But all of that ties together. So what we do is just have a bunch of bets, as we put two bets here in sample, some of them will be features, some of them will be pricing, some of them will be other types of experience. It doesn’t have to be only features.
Ashley: Messaging, or copy.
Sofia: Yes. And that is the beauty of it, because as a product team, as a researcher, as a product manager, you have the opportunity to really affect other parts of the business if you stop thinking about research as a feature factory. You know, think about research as a broader thing: as a way of enabling other teams, helping other teams find interesting things to test, and empowering teams to try different things. Then you have an opportunity to have a massive impact on the business, because maybe a bet that comes out of your product planning is actually a bet that’s gonna be executed by the growth team, or the design team, or whoever else … customer success. So, you can really have an impact across the entire organization by doing great customer research.
Sofia: So, in this case the one that I’m showing here is a feature, in specific, but it doesn’t have to be.
Sofia: And then of course you have to have all your tracking, to make sure that you understand whether or not things or working, you have your cohorts to make sure that people that was exposed to this, this happened, people that wasn’t exposed to this, bla bla bla. All this type of sort of follow up. But this is how we think about it, so we end up every month with bets, that we want to accomplish. And then of course you have different problems with this. You will have many bets, which one we start first, which one we believe is more impractical, and all the different opinions, and that’s just a normal thing about running a business, prioritization is hard. But overly speaking, it feels way better when you are confident because you have a lot of background research, and you understand the people you are trying to serve.
Sofia: So, this is a bit of how our process work.
Ashley: So you find all the customer feedback and research you do helps you, as well as your team, prioritize a little bit easier?
Ashley: Cause it’s always a challenge to prioritize everyone has finite resources.
Sofia: Absolutely. And it’s always going to be, because even though you have all that research, what the research does is give you confidence and context, so you minimize the risk of building things that are not gonna be used, or building things that have no impact. You definitely minimize it; however, you can’t predict 100 percent what the impact is gonna be. Sometimes it feels very straightforward, sometimes there’s nuance in what you’re trying to do, especially if you’re building a completely new experience, or trying a completely new opportunity, building a completely new product within your existing business.
Sofia: Improving an existing product is all about the experience. So, you can find other opportunities, and it’s a very different animal, if you like. When you’re improving a product, the customer research help using hands, the existing experience on the that is made. But when you’re doing something that’s completely towards innovation … we’re gonna launch a completely new product, we’re gonna launch a new experience for a completely new archetype, completely new market. That’s when you can do a lot of research and minimize that risk, but you still kind of assume that that will always be true, and it will always be affected. So there will be risk, but you can kind of minimize it by doing the appropriate research.
Ashley: Yeah, and I think that’s a great point. So, just to recap how we got here, with this particular bet, which I love. I haven’t seen this system much, if at all, so I think it might be unique to you and EnjoyHQ, and it’s very interesting. You’ve isolated, via your quantitative data in Amplitude, that in the onboarding process people drop off after trying several search options. From talking to customers, from looking at help tickets, from looking at feedback they’ve sent you, from running field studies, user interviews, user testing … all of these different modes of customer feedback, you’ve seen people are not being successful with this search feature. So that’s the insight that you have.
Ashley: And you have some acute reasons here that you’ve filled in that context with based on that research. So now, based on that, now if we change it, what could or could not happen? Both positively and negatively. Does that sum it up how you’ve gotten to that point?
Sofia: And it’s just a cycle, right? You do this, and then you have to go back to the feedback you’re receiving on this new experience that you put out there, and see if it worked or didn’t work, and what you could do in addition to make it even better. Is it worth doubling down? Can we turn this thing that initially felt like a problem that needed a fix into something that becomes an amazing experience? So it’s no longer a problem, but an enabler, something that gives your customers an experience where they go, “Wow, this is awesome!” Then you turn that into an asset. A lot of the time when we fix problems in product, generally speaking, we think about it as, “I have a problem, I have to solve it.” But because we have so many problems to solve, very rarely do we step back and say, “Can we turn this into a delightful experience?”
Sofia: “Okay, have we achieved something, can we turn this into something better?” And have the discipline to go back to the things that you release, and the features that you have, and really making sure that they’re working or not. If not, if working make it better. It’s super easy and everyone has the same problem. It’s very easy to check on the feature, check that it’s working, yeah, improve the product, move on. Next thing, and then you end up never revisiting the experience that actually make an impact on people’s life with your product.
Ashley: So how would you recommend doing that checkpoint? You figure out a problem, you’ve looked at either how to fix the problem, or, I love how you just said that, how you can turn a problem into a delight. I think that’s an amazing lens to look at things through. But let’s say you’ve turned the problem into a delight that’s resulted in a new feature. How would you recommend teams do that check back and see how it’s affecting users’ daily lives, and not just do a one-and-done and forget about it?
Sofia: It’s not a very scientific approach, but this is what we do internally, and it works for us. We have certain principles and rules within the engineering team, and the product team in general. When engineers and product people are planning and looking into opportunities, we try to understand what proportion of our bets are new features and new things –
Sofia: And what is a proportion of things that have to do with improvement with the current experience, and the current experience capabilities that we have. So, we try to look at it in those lenses and say, “Well it definitely cannot be 80 percent new stuff and 20 percent improvement.” We have to look at it why can we keep a balance. So, can we always be looking into the future, but never forgetting what we’ve done so far. So, it’s not scientific cause we don’t have really a number. I would be lying if I tell you, “It has to be 50 percent!” That’s not how it is. It’s just more about sensing, as we are working on this, are we doing too much new stuff, because it’s exciting, it’s easy to do new stuff, always.
Sofia: It’s very easy to get excited with new features and doing new stuff. But it’s harder, it’s just to say, you know what … this spring, we’re gonna work on search again, cause it turns out that we might be able to do this better, and it’s just justifiable because these archetypes are very important, and if we can convert more of those, we will drive x y and z goals. So, it is about that discipline. And I think that is have for the manager role, to be honest, that’s the problem managers have to be very strategic and think about, how to sort of build a culture, where this is okay. Where it’s okay to go back, and revise, and do better work on what you already built, as opposed to getting distracted with shiny things. And I’m telling you this is a struggle for us, and for every product manager that we talk to, so it’s not a perfect solution, but it’s a reflection.
Sofia: For us, it’s more a reflection. It’s just we have a bunch of rules, and one rule is, let’s have a look at the proportion and we’re just inventing here a bunch of exciting things for the future, or are we actually addressing things that could be better that we already have. That’s simple.
Ashley: No, and I think that’s a great and overlooked point. When we think about game-changing product opportunities or insights, we think about new things we can do, rather than going back and looking at what we built before to see if it’s the best solution. And looking at that balance, whether it’s 50-50, or 60-40, whatever it might be, is a fantastic point, and not one that all product teams are necessarily thinking about. If companies are doing customer research, if they’re talking to their customers, if they’re paying attention to feedback at all, then that non-scientific check, yes, it’s not scientific, but you’ll have a pulse on things.
Ashley: And I think that’s an under-talked-about point as well, and I thank you for bringing that up.
Sofia: Perfect. So, I don’t know, do you have anything that you would like me to discuss a little bit further? Because this is a process, and as I said before, it’s easy to go through it very fast, and there’s a lot of nuance and little things that we do, and it takes a ton of time to make this happen. But as a framework, personally I feel that the team is just happier working on things. For engineers, designers, and everybody else, there is a sense of satisfaction that comes from understanding that we are building the right things, that we are building things for the right people, and that we understand the people we’re building these things for.
Sofia: So, it is important that that exists, because that’s a culture that really helps with customer centricity. When you have people that feel really connected with their work and how that work affects other people, and they know who those people are, normally that creates an environment where more experimentation, more bets, more opportunities happen. So I feel at this point, after many years of iterating on our processes and way of doing customer research, and prioritizing in this specific flow, I really hope that everybody watching this gets some value from this process, and can maybe pick one thing, two things that might be good additions to their current methodology, their current approach.
Sofia: But this is a mix and match of many, many things that we have been learning for a long time now.
Ashley: Yeah, this has been fascinating, and I know it’s gonna be super beneficial. I do have one quick question, and we will wrap this up very shortly, you’ve been very gracious with your time. Do bets replace what product teams might often refer to as a test or an experiment? Or do you use them alongside tests and experiments?
Sofia: No, because a bet is that. A bet is an experiment and is a test. And a bet doesn’t have to have a long description and be a big project. A bet could literally be something very small, like changing a button or changing some copy, and sometimes bets are way larger themes, large initiatives, for example changing your pricing. Sometimes it’s a large feature. And other times it’s just testing something very small. So a bet is a bet. It has nothing to do with the size of the bet; it’s about you having a research question, a hypothesis, and being able to try and validate it, whatever size it is.
Sofia: Of course, a bet will always be prioritized based on whether or not you believe it has a big impact. But a small bet could have a massive impact anyway.
Ashley: So, if I’m understanding this correctly, and please tell me if I’m not, basically the way you guys look at bets is that’s your hypothesis. Whenever you’re running a test or experiment, you always wanna have a hypothesis, or an assumption of what you think will happen. The key difference with bets that I’ve noticed is that you identify both a positive and a negative consequence, and you actually could have more than one positive and/or more than one negative consequence, as opposed to one key hypothesis or assumption.
Sofia: Yes, correct, yeah. It’s more flexible from that perspective.
Sofia: And it also allows you to understand whether there are other things that you want to measure. The more specific you make your hypothesis, the better. What I like about bets is that they help you think about it more broadly. I just feel it provides you more context, because real life is not a straight line. When you think about a hypothesis, it’s: did we validate this? Yes or no. Yes. Awesome. But if there’s anything else that we can learn from this, how is that gonna help us build knowledge for the next thing we want to do? I think it’s about understanding more implications. There’s no one thing that I do here, one perfect line, where this is the outcome. Sometimes there are more implications. So if you try to think about it in broad terms, I think it provides opportunities to catch better ways of measuring it, better ways of positioning the bet, maybe narrowing the bet down slightly, and so on.
Sofia: I don’t know if that’s very helpful in general, but we find that when we create bets that way, it feels more tangible than just saying we’re gonna do this thing and somehow this other thing’s gonna be affected, and that’s it. With scientific experiments, that’s the methodology behind it anyway; that is the way it is, and that’s how it works, and it’s awesome. I just like to see those with a little bit more context than a straight line. And that’s all it is. Honestly, a bet is just a hypothesis, it’s just that we try to understand it with a bit more context.
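Sofia's framing of a bet — one hypothesis plus the several positive and negative outcomes it might produce — can be made concrete with a small record type. This is an illustrative sketch only: the `Bet` class, its field names, and the example outcomes are invented here and are not EnjoyHQ's actual format.

```python
from dataclasses import dataclass, field

@dataclass
class Bet:
    """One bet: a hypothesis plus the several ways it could go right
    or wrong, rather than a single pass/fail assumption."""
    hypothesis: str
    positive_outcomes: list = field(default_factory=list)
    negative_outcomes: list = field(default_factory=list)
    observed: set = field(default_factory=set)  # outcome labels seen so far

    def record(self, outcome: str):
        """Note that an expected outcome (good or bad) was observed."""
        self.observed.add(outcome)

    def summary(self) -> dict:
        """Which anticipated upsides were confirmed, which risks were hit."""
        return {
            "confirmed": sorted(self.observed & set(self.positive_outcomes)),
            "risks_hit": sorted(self.observed & set(self.negative_outcomes)),
        }

# Hypothetical example echoing the search-rework bet mentioned earlier.
bet = Bet(
    hypothesis="Reworking search will convert more of the researcher archetype",
    positive_outcomes=["more searches per session", "higher trial conversion"],
    negative_outcomes=["slower page load", "more support tickets"],
)
bet.record("more searches per session")
bet.record("slower page load")
print(bet.summary())
# {'confirmed': ['more searches per session'], 'risks_hit': ['slower page load']}
```

The design choice mirrors what Sofia describes: instead of one yes/no assertion, the bet carries several anticipated implications, so "partially right, with a side effect" is a representable result.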
Ashley: No, I love that. So for you, your product team, your growth teams, and the whole company … bets replace hypotheses in any experimentation and testing, because it’s not a straight line; you want to give more context, more color, because that’s where the richness is found.
Sofia: More possibilities. More possibilities of being wrong. That’s the thing we’re looking at. When I think about this process, we are just trying to identify more ways that we could be wrong.
Ashley: Where do you guys store your bets? Where do you track and store those?
Sofia: In the same place. We have this product strategy presentation that contains those archetypes, and it contains the bets as well.
Ashley: This is part of that monthly all-hands, then. Okay.
Sofia: And every month we are revising it. Every week we have a planning meeting that allows us to see which bets we are working on during that week. We have very short cycles, basically one-week sprints. We’ve been doing this for a long time, and we like it that way. So we have those every week on Fridays, and then we have the monthly meeting where we look at the product strategy and which bets are there. We take them back to the team: this is the bet that we shipped four weeks ago, this is the result we have so far. Some bets need a lot of time gathering data to see whether or not they work, so we keep that reporting going: we’re working on something, is it working, is it not working.
Sofia: And I guess that’s why I like bets. Not everything is like that, not everything is a micro-interaction. Sometimes we’re working on full workflows, sometimes we’re working on completely new experiences, so you need more time to gather data to understand whether or not it works, and you also need to redefine the bet, and so on. So that’s how we work with it.
Ashley: No, I think that’s great. I’m just gonna ask one last question, and then we’ll wrap it up; this has been fantastic. If you have one tip that you wanna leave product teams, founders, marketers, growth teams, whoever might be listening here today … one opportunity to find an insight that will either help prioritize their existing roadmap, or change their roadmap, what’s one big takeaway you would like the audience to learn from this process?
Sofia: Yeah, I think it’s less about the process, and more about the mindset. I’ve talked to a lot of product teams that feel they don’t have enough time to do research: “I don’t have time to do interviews, I don’t have enough time to look at the feedback.” There are just so many things going on; everything is very focused on delivery. But if you think about it, it doesn’t make sense at all. You’re so focused on the delivery, but you’re not really understanding whether or not that delivery’s gonna have an impact, and you’re so focused on the delivery that you forget who you’re building for. And that creates massive stress and anxiety, because they know they’re not necessarily paying attention to the things they should be looking at, like qualitative data, and numbers, and tracking the stories.
Sofia: And that anxiety drives more focus into delivering, because maybe if you ship a lot of things, something will work. And it’s just a horrible cycle. So you have to allow yourself to have the mindset that this is important, and that will give you the confidence to build better experiences, but you have to believe it first. You have to be able to say, “Yes, customer research can help me and the business understand customers and define opportunities. Therefore, we’re gonna put in the time and the effort with the entire team to build a process that helps us learn from customers.” And that is a decision; it’s a mindset. But if you don’t even consider it because you’re like, “I don’t have enough time, I’m so tired, everything is happening, I have to build, build, build,” that will never happen, and you will just increase your anxiety.
Sofia: So, do yourself a favor: step back, build a process to learn from customers, and then build on top of that a way to deliver, on a more continuous basis, as opposed to ignoring it completely.
Ashley: No, I think that’s so great. So, even if you take some of the process that Sofia showed us today, you’re never gonna get the full value from it if you don’t change your mindset first. And that means you have to talk to customers, you have to be learning from customers, and you have to do that on a consistent basis. I think that’s fantastic. Sofia, we didn’t get a chance to see how you use EnjoyHQ yourselves; maybe we’ll have to get you to show us how you guys actually use your own product in the future. But if people wanna get started using the product, or give it a shot, where can they find it?
Sofia: So, the website is getenjoyhq.com. You can open a trial; it’s 14 days. We’re always here to help, so just give it a go: centralize your customer feedback, build a couple of reports, understand how you can leverage that information, but get started. As I was mentioning before, being afraid of what the process might look like, and feeling that it has to be perfect, is not the way to achieve anything; you have to start very, very small. So if you don’t have a process, just start by deciding, “Okay, I’m gonna pay attention to this customer source. I’m gonna pay attention to these interviews, I’m gonna pay attention to this small thing and build on top of it.” In terms of what we can do for you: connect your feedback sources, organize your user data, and build research projects around your customers.
Sofia: And you can do that on our website, getenjoyhq.com.
Ashley: Perfect. And you have an API, but you also have a lot of automatic integrations with things like Intercom and Delighted that make it really easy. And what I love about it is that it helps aggregate all of those data points, from all of those different sources, by individual customer or user, and I think that’s one of the true values in it as well. Shameless plug, shameless plug here from us.
Sofia: Thank you.
Ashley: Alright, well, thank you so much for being here today. I know we went a little bit long, but there were just so many great things you brought up, I couldn’t help but dig into them a little bit further. I know this will be really valuable, so thank you so much for sharing.