Richard oversees Coursera's infrastructure and product development. Prior to joining Coursera, Richard held various engineering leadership roles in the early days of LinkedIn, with a key focus on scaling the Jobs marketplace and Talent Solutions to become its first billion-dollar product.
Richard also oversaw product development for LinkedIn's international expansion. Prior to LinkedIn, Richard spent over a decade at Microsoft leading various product development teams, including MSN Hotmail, Active Directory, Windows Server, and System Center. Richard received his Master’s degree from Stanford University.
"And at one point we were too reactive in thinking about quality... We just followed the old way of ‘Keep shipping! Ship very, very fast!’ without thinking too much about the experience and the quality.
And very quickly you hear these customers shift the conversation from wanting more features to their frustration with our platform. To the point that it became a major business risk for the organization...
...So the statement that we defined at Coursera is... ‘People should be able to run as fast as possible, as long as you meet these three quality objectives.’"
- Richard Wong
Mesmer's AI-bots automate mobile app accessibility testing to ensure your app is always accessible to everybody.
To jump start your accessibility and inclusion initiative, visit mesmerhq.com/ELC
ELC's Peer Groups provide a virtual, curated, and ongoing peer learning opportunity to help you navigate unknowns, uncover solutions to your challenges and accelerate your learning with a small group of trusted peers.
The job is hard. But you don’t have to do it alone!
To learn more and apply - Click HERE
Patrick: So first off, an official welcome to the show, Richard. Thanks so much for joining us on the podcast.
Richard: All right. Thank you, Patrick and Jerry, thank you for inviting me to be here.
Patrick: So to kick us off: our community asks us all the time, in a lot of different conversations, the big question of how do I navigate speed versus quality? And this comes from people at all different levels of the company.
And what makes this conversation special with you, Richard, is your experience at Coursera. As they say in baseball, you've pretty much covered all of the bases: going from zero to one, to finding product-market fit, to scaling, to recently IPOing. And throughout all of these stages, you've had to navigate the shifting dilemma between speed and quality, which is very, very tricky for people.
And so we've planned to talk about a couple of those different phases. To begin, I was wondering if you could share a story that captures this dilemma for you. Is there a specific moment or conversation from your time at Coursera that has embodied this dilemma? Can you tell us that story?
Richard: Yeah, absolutely. So, first of all, thank you. This is a great question. I wouldn't say that speed and quality are diametrically opposed to each other. In fact, for many companies, it's part of our job description as engineering leaders to design and optimize a process or system that allows us to move on both speed and quality at the same time.
But from time to time, as the business evolves, we may be falling behind on one side and need to shift the balance to the other side. I often think about engineering, or engineering processes, like designing a traffic system. Think about a typical traffic system: when you go onto El Camino Real or 101, they designed a system to try to optimize for two things.
Right? Number one is to maximize the amount of traffic flow on the street, so everybody can get to their final destinations without meaningful disruptions. But you also want to make sure that there's some level of safety. Maybe people could move very fast, but you don't want everybody to die at an intersection because they collided with each other.
This is pretty much what happens in an engineering organization. A traffic system defines a set of rules: you drive on the right side of the road, you flash your signals before you turn, and you move when the light is green and stop when it's red.
In engineering, you do the same thing, except the type of design is slightly different. So you have design reviews, unit testing, setting SLAs for your services. That's what we call processes here.
But even in the real world, the traffic system is not static. Right?
So think about this: if you actually live in a town with only a couple hundred people, there are seldom any cars on the street. Usually the amount of regulation or restriction is pretty limited, right? Many of the intersections are probably uncontrolled; there's no speed limit.
Why? Because in that kind of setting, the stakes are really low in terms of the chance of causing an accident that does big damage to anyone.
That is the same thing, I think, for an early-stage company. When I joined Coursera, Coursera was a Series B company. We were much smaller than we are today. At that point, what you basically do is run fast: you allow people to go in different directions and try to figure out as much as possible what is needed by customers.
In fact, the reality is that 90% of startups run out of money before they get to the right idea. And even for successful startups, 90% of ideas don't work out on the first trial. So you'd better not spend too much energy or time thinking about scaling or building a perfect product to start with, because most of these ideas will get thrown away.
But you know that things change over time. When your town becomes more populated, at some point it becomes San Francisco or New York City... Then you have many more limitations, right? You have a stop sign, you have traffic lights, every hundred yards.
You put in an emphasis on maximizing the balance between speed and safety. So I think a growth company, like what Coursera has been going through, is constantly redesigning its traffic system, like a city where the traffic doubles every few years, where the number of people doubles every few years.
Sometimes it's more evolutionary, but sometimes it's because there's an accident. Some big issue happened that pushed you toward that change.
Richard: So you asked, what are some examples? We definitely had that kind of challenge about three to four years ago when we started scaling. When I say we started scaling, it was a situation where we had built our enterprise business.
We had built a degrees platform on our system. In the early days, many users were using Coursera just as a hobby, right? They enjoyed the learning; they just came here whenever they had free time. But at that point, Coursera had actually become the platform that our enterprise customers were relying on to uplevel their employees and organizations, and universities were putting degrees and final exams on our platform.
It was a turning point for us. These customers have much higher expectations on our product functionality and quality. And at one point we were too reactive in thinking about quality.
We just followed the old way of keep shipping, shipping very, very fast, without thinking too much about the experience and the quality.
And very quickly you hear these customers shift the conversation from wanting more features to their frustration with our platform. They said, "When I try to do this, it fails, it errors, it doesn't work. It gives me incorrect information."
To the point that it became a major business risk for the organization.
Each quarter at the company, we actually talk about highlights and lowlights, right?
Three, four years ago, there were a couple of quarters where the major business risk for our company was these kinds of frustrations from customers about the quality of our product.
It was definitely very stressful, as the head of engineering, to try to manage through that situation.
But the fact that across the leadership team there was strong alignment to shift the balance toward the quality side was a great outcome; I couldn't ask for more.
Once you have that, it's much easier to actually drive that level of execution across the organization to fix the problem.
Jerry: I'm curious to learn what changed that eventually helped improve the quality of the products. Was it prioritization? Was it the amount of resources? Was it something else?
Richard: I think it's a combination of all of them. You definitely need to start with high-level alignment among the leadership team. I mean, here's the truth: if you think about the growth of a company, there are many, many dimensions and factors that impact and support that growth.
Sometimes the limitation may be on the sales side; the head of sales may be running into challenges or big opportunities for the company. Sometimes it can be on the marketing side, sometimes it can be something else.
But in the case of quality, the responsibility definitely falls within the product and engineering organization: how do we change the game by showing our customers a quality product?
Now, in that particular situation, it's so important as the head of engineering, or any engineering leader, to be able to explain to your cross-functional partners why this is important.
The majority of the time you'd think, oh, of course, who doesn't want a better-quality product?! Everybody wants it. If you ask anyone, "Do you want us to have better quality?" no one will say no to it. It's just about the trade-offs.
It's about saying, well, if we need to improve quality, that means we need to delay or stop building these particular features for the next quarter until we fix the underlying issue.
That is where the challenge happens, right? We need to stop something in order to give way to this.
I think building this sort of alignment is so important. How do we get the alignment? First of all, I think every leader in the organization should get the basic facts and information correct. Lots of times, misalignment happens because people don't share the same set of data or information.
So for example, maybe we had ten customer escalations last week from top customers that I'm aware of. But if our head of sales is not aware of that, then she'd definitely say, "No, no, no. Why are we spending so much time on quality?"
So sometimes it's as simple as, "Did you know that your top customers say they are very frustrated with the platform?" But sometimes, even when people are aware of the information, it may still come down to different judgment, because there may be some information I am not aware of, or some experience people had in the past, that leads them to a different conclusion.
So I think the first starting point is definitely building that alignment in the organization by sharing information.
There will be cases where the judgment is still incorrect. That is when you need to escalate and try to get a clear decision across the organization about priority.
I think in our particular case it was pretty obvious about three, four years ago, when we experienced this problem in the organization. When we got the escalations from our customers, across the leadership team we all agreed that we needed to address that particular problem.
So that is the first step.
Richard: But after that, I would say there are some tangible and technical steps that we needed to take together. Not just "improving quality," because quality is a very, very ambiguous term. I'm sure if you go ask your engineers and your cross-functional partners, "Should we improve quality?"
They would say yes. But what do you mean by quality? They will probably give you 15 different answers on what quality means, right?
The most obvious is probably bugs or issues, but sometimes it's about the usability of the product and how smooth the experience is, or the performance of the site.
And sometimes, when customers say that the quality of your product is not good enough, they actually mean that you're missing some features compared to your competitors. They'll say that the quality of your product is not good, but they actually mean something else, right?
So it's still important, in an organization, Jerry, to your question about what we need to do, to define the objectives clearly. When we say we need to improve quality, what objective functions are we trying to optimize for?
In our case, when our customers told us we were not up to the standard or expectations to meet the growing needs of their critical business usage, it was pretty clear to us that the issue at that point was the functional defects we had in our product, right?
We had bugs here and there across our system that did not meet expectations...
So the first thing we needed to do was agree on fixing quality issues as a priority.
The second thing is that, of the quality issues, we knew the most important thing was to reduce the number of functional bugs in our system, which we defined as the number of P0 and P1 bugs reported every month.
And then after that, you need to set some objectives and measurable goals as a proxy for that progress.
So for example, when we measured the last quarter, we saw that every month we had about 20 P0s, which was what caused the frustration of our customers. And then we said that in six months we wanted to reduce that from 20 to five. You need to set goals to encourage your team and incentivize your team to pay attention to it.
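The goal-setting Richard describes here, stepping from roughly 20 P0 bugs per month down to 5 within six months, amounts to a simple monthly glide path. A minimal sketch of that arithmetic, assuming a linear reduction (the function name and numbers are illustrative, not Coursera's actual plan):

```python
def p0_glide_path(start: int, target: int, months: int) -> list[int]:
    """Monthly P0-bug targets, stepping linearly from start down to target."""
    step = (start - target) / months
    return [round(start - step * m) for m in range(1, months + 1)]

# A 6-month plan to go from ~20 P0 bugs/month to 5, as in the example.
print(p0_glide_path(20, 5, 6))  # [18, 15, 12, 10, 8, 5]
```

Publishing intermediate targets like these lets the team see progress (or the lack of it) every month rather than only at the six-month mark.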
And once you have defined these kinds of standards, it is much easier! Because at that point, I don't even need to be the person to solve the problem; I'm probably the dumbest person in the room to solve that problem. People inside the organization are much closer to it. They understand what is working and what's broken. They will come up with creative solutions, whether it's about changing processes or changing technology.
And of course, throughout that process, we needed to inspect everything. We set up a test organization, we built a test framework, we have post-mortem meetings, we have retrospective meetings every week to review all these bugs. But those are secondary.
Once you have that kind of system and structure, even if it does not work on the first trial, your team of engineers and leaders will be able to quickly find new solutions to address those issues.
Jerry: So what I hear is: first you have to have information parity across different teams, to have a common understanding of what quality really means. Then it's the alignment of priorities, and later on the processes that come into place, which ensure people have something to hold on to, so they will iteratively improve.
Richard: Yeah, so I would say those numbers are mostly my performance review, not the team's performance review, so that I am super motivated to actually fix the problem. But when we set these goals, the primary motivation for me is less about performance evaluation.
We set business goals for our company. We talk about the number of users, we talk about revenue, we talk about renewals, we talk about utilization of our product. If this is something important, then we talk about it, in all-hands meetings and in team meetings.
And this is something that not just I hold myself accountable for. I published it in my OKRs in the company. I said that we need to hit these kinds of goals, but I also asked my peers, "Do you want me to hit this goal?"
That goes back to the point about alignment, right? I want to hear from you clearly whether this is the most important thing you want me to accomplish to help the growth of the business.
Head of Sales, Head of Marketing... tell me if this is something you really believe we should be fixing. If yes, it's clear: we need to work together and drive to that outcome. And if I'm talking about this in all-hands meetings, in every single meeting we have, and I'm paying attention to these numbers, I think very quickly people throughout the organization will understand the criticality of this problem. Then they will be self-motivated to drive and find solutions for it.
So that's how I see it.
Jerry: So it takes time and repetition for people to really understand the importance of quality, and, as a leader, the patience and willingness to repeat yourself to ensure the message is delivered. Especially if the team is large.
Richard: Yeah, I think that applies to pretty much any changes in an organization, right?
Organizations have inertia. They have been following a certain set of rules and processes. Going back to that traffic system: if you're driving down the road every single day to the same destination, you've probably developed some habits around how you navigate the stop signs and red lights, and mastered your way to your destination based on that set of rules.
But if all of a sudden there's a detour, and the way you drive on the road changes and you have slightly different directions, it will take a while for people to get used to that. And sometimes people get frustrated because they don't know whether the detour will lead them to the final destination. They get frustrated because they just get slowed down, at least for a period of time. They have no idea if this is going to be better for them.
So it's important for us as leaders, whenever we drive any change, whether it's pushing for speed or pushing for quality, to constantly radiate positive energy in the organization, explain why this is important, and also show progress.
I think when people see their work translate into progress, translate into results, and there are quick wins, it's much easier for people to continue to commit and push for further progress.
Jerry: One observation from my own experience is that sometimes, when customers or users have a lot of complaints about the quality of a product, it's actually a good thing, because it motivates people to do better. It can be a trigger for a major change that would otherwise be hard to convince people to make.
What's your take on that?
Richard: Oh, I absolutely agree with that. I think that is still my bias as well. In any engineering organization, highly talented engineers take pride in what they achieve.
Every engineer says, "I want to build a product that lots of people want to use. I want to build a high-quality product, with a lot of craftsmanship put in place: highly scalable, highly functional, best performance." This is all true. We all want to achieve that.
But in reality, at the very early phase of your product, you don't want to over-engineer it until you have some customers who are really using it.
I've learned these lessons at multiple companies. Sometimes, because of our pride, we say, "I want to build something very scalable."
And we did try to build something very scalable that could take in millions of users. But when we released the product to the market, like five people used it, and they never came back...
It's happened to all of us before. So it's still important for us to tune our strategy on what to focus on at the right time.
So that means at the very beginning, you want to spend all your energy innovating on a new product, to the point that some customers start to give you feedback. Whether it's good feedback or negative feedback, both are fine. If people are giving you negative feedback, you know what that means?
They're basically saying, "Wow, I really think your product is trying to solve a problem that I'm experiencing or facing. That's why I'm willing to use it. And when I tried to use it, I ran into these issues that I really hope you can fix for me, so that I can continue to use your product." That's actually what they mean by that.
The truth is that if your product does not provide value, you don't even get started. All of us probably get a lot of spam email every day; we don't even bother to look at it and try the product to start with. Or sometimes after we've tried a product, we know it's not even remotely solving the problem we're facing.
Are you going to provide feedback to them? No. You'll just ignore it forever, right? You're not going to use that product again.
So getting feedback from your customers is actually a good sign. I will always choose this path: we build something, our customers want it, and they tell us something we're not doing well enough. I'm not saying I enjoy building a crappy product. I'm saying I enjoy building a product where our customers say, "You need to do a better job," so that we really spend time building it to meet their expectations.
So that is still the primary bias I have.
Even at Coursera, we have iterated from the very beginning, from the zero-to-one phase, and we are still growing. I would not say the IPO is a final milestone. We still have many, many years ahead of us to continue advancing our mission, to be a great product that serves many more users.
We still try many new things to get us to 10x or 100x in terms of the reach of our customers and our learners.
In many, many cases, I still bias toward pushing on new functionality and product, and optimizing for speed, as long as we meet some basic expectations of quality for the users.
Patrick: Well, I had a follow-up question on how you formulate that shared understanding of what that basic expectation is. I think one of the dilemmas that different engineering leaders have shared with us is that there's tension at different levels, where some frontline managers, or maybe director-level managers, are seeing signals that "we need to optimize for quality," defined as, you know, removing bugs and creating consistent user experiences.
Then you have executive teams that are optimizing for how to build new features and roll out new things in new lines of business. And so there seems to be a mismatch in terms of the mindset for how decisions should be made.
So I would love to learn how you help bridge the gap and create that shared sense of understanding of what that basic level of expectation is.
Richard: You know, I think this is a great question...
There are short-term things and long-term things. Let me explain the long-term thing first.
At some point, the executive team and the leadership need to achieve a trust relationship between the different functional leaders: "Okay, if you are the head of marketing, then I trust that you are doing the right set of things to run the marketing events, business, and methodology. I can provide some input and feedback to you based on what I know and what I observe..."
But I have to trust that the process and the investment over there in the marketing systems is the right thing, or approximately the right thing, and that you are doing the best that is possible for the company.
It's the same on the engineering side. As an engineering leader, one of your most critical responsibilities is not just driving execution; it's to represent your engineering team and gain the trust of your CEO and your peers, so they say, "Richard, you are mostly making the right decisions for the company."
To the point that they don't need to spend too much time evaluating your decisions every single time. They basically say, "Okay, it seems that you are delivering your product, you are driving the business, driving technology innovation, but you're also addressing a lot of the basic technical debt, quality issues, and architectural issues."
Over time, that's how you establish a system that allows you to have some freedom to make that adjustment within your team, right?
The people most familiar with the technology problems are the people sitting on your team: your engineers, your engineering leaders. They will probably be able to make the best judgment about the level of investment. But you want to get the relationship to the level where you have the freedom to drive that kind of change without frequently being questioned or challenged on that level of decision.
So this is the long-term thing that every engineering leader should strive for. Now, short-term, how do you actually get to that reconciliation? I don't think there's a magic formula that gets you to that outcome.
Patrick: It would be nice if there were a magic formula!
Richard: Yeah, I think the first thing is, well, I almost think about improving quality like building product. How do you think about building product? You set some goals and then you iterate toward those goals. Sometimes it works and sometimes it doesn't, and then you evolve and change over time. But you need to set a clear expectation with the people around the organization about what we are trying to accomplish as an organization.
Going back to your example: the first starting point is not agreeing on whether we need a test team, or test automation, or unit testing. That is not the thing people need to agree on! People need to agree on what success looks like at the end. If people cannot agree on what success looks like, they will never agree on the methodology.
So as an engineering leader, when you say, "I struggle because my frontline engineers have a different opinion from my managers, from me, and from the executive team," maybe the first thing you should do is put a stake in the ground and say, "When we solve this problem, this is what it's going to look like."
Maybe it's the uptime of the system, maybe the number of bugs, maybe the number of customer escalations. People need to start agreeing on that front first, and everything else will follow.
So at least that's my experience.
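Richard's "stake in the ground" (uptime, bug counts, escalations) can be made concrete by encoding the agreed definition of success as data and checking each month's results against it. A minimal sketch; all thresholds and names here are hypothetical illustrations, not Coursera's actual targets:

```python
# Hypothetical, agreed-upon success criteria: metric name -> pass/fail check.
SUCCESS_CRITERIA = {
    "uptime_pct": lambda v: v >= 99.9,        # system uptime this month
    "p0_bugs": lambda v: v <= 5,              # P0 bugs reported this month
    "customer_escalations": lambda v: v <= 2, # escalations this month
}

def unmet_criteria(monthly_results: dict) -> list[str]:
    """Return the names of criteria the month's results fail to meet."""
    return [name for name, ok in SUCCESS_CRITERIA.items()
            if not ok(monthly_results[name])]

print(unmet_criteria({"uptime_pct": 99.95, "p0_bugs": 8, "customer_escalations": 1}))
# ['p0_bugs']
```

The point of the exercise is the data structure, not the code: once everyone has signed off on the criteria, disagreements shift from "should we invest in quality?" to "which criterion are we missing?"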
Jerry: Could you give an example of when you've had that conversation with your executive team or cross-functional leaders?
Richard: Yeah. I mean, it happens all the time. Sometimes it's because of a crisis. When I say a crisis, that's the story I shared from about three, four years ago: "Okay, we got lots of new business, we got lots of new partners, and all of a sudden they all tell us the product sucks..."
Those are crises, and that one is actually easy. Even without me telling my CEO about it, he will come to me, because, you know what, those escalation emails go to him. Customers actually send him emails saying, "We don't want to use your product unless you fix the problem." So those are actually pretty easy.
The majority of the time, it's just about showing a level of confidence. You tell them, "We've got it. I understand the criticality. Give us three months, and these are the several things I'm going to do, and you will be able to see the outcome."
It's really about projecting a sense of confidence and the will to address the problem. That will mitigate a lot of the issues. Of course, you have to deliver at the end, but I'm pretty confident about that part in my organization.
I think the more challenging part is when there's no clear crisis. We all know we have technical debt in our system that is slowing us down on a daily basis. Or it's much more difficult for engineers to get their jobs done, or they easily get paged at 2:00 AM because of some issue in the system that we've known about for a while but just didn't feel we had time to fix.
Those are the more difficult conversations, because, first of all, your CEO is not going to call you and say, "Hey, what are we going to do about this?"
Those are more hidden issues. Your team is suffering, but it does not propagate to the top level.
My conversation at that point is primarily with my product leaders. If you think about quarterly planning, what we do is lay out a set of business and product priorities for what each team needs to deliver to achieve the business goals.
I mean, that is a pretty typical process, right?
Every quarter we set some goals and say, we do project ABCD. And at the end we'll have like 5% more revenue or users or something like that.
And then we do a pretty rigorous prioritization. Of course, the product managers have a very big influence on the priority within that list, because they understand the needs of customers.
They understand the business strategy, and engineers provide input, tell them how easy or difficult something is to do, and make some alternative suggestions.
But the contract I have set with my product leader, Trav, my product counterpart, is this: every quarter, I need to reserve 10% of my engineering time to solve technical debt issues within the organization. Don't even ask me what those projects are... we'll definitely make good use of that time, and we will definitely show some good results, coming from the engineers, on the kinds of problems they can solve in the organization.
I think that really empowers our team, having that level of trust and agreement with him and with the rest of the organization. It allows the engineering organization to really focus on how to best utilize this 10% of time, whether it's on productivity issues or bugs or on-call issues. And I think that's the way we drive it.
Of course, sometimes there is give and take, right? In some quarters, maybe that 10% becomes 8% because there are major deliverables we need to hit for partners, learners, and customers.
But there are also periods when the product team may still be thinking through a longer-term project, so the engineers have a little more space and expand that 10% to 15% of the time.
Over time, you establish trust and a system in the organization that helps you manage this, rather than, every two weeks, having to go to your product person saying, "I need to improve the system."
Instead, there's an ongoing contract, set up one time, that empowers both product and engineering to maximize their productivity and their experience while developing the product.
Jerry: That's really valuable to know. I believe that 10% baseline agreement not only gives you freedom on the engineering side to work on things that feel important, but also, to some extent, prevents the other side of the problem — focusing too much on quality and actually over-engineering it.
Jerry: Which transitions into my next question: how do you prevent people from spending too much time optimizing for quality?
Richard: This is not easy. Again, I go back to the point I was talking about: we all want to do the best job, right? And a lot of the time engineers pride themselves on the technical sophistication of the work they have accomplished.
Right? So we designed a system that can process a billion requests per hour. We designed a system that is two milliseconds faster. We all take pride in these kinds of achievements, the skills we can demonstrate to others. Right?
So one of the most important things I want to help my engineers with is asking: if we achieve that kind of result... if we complete this project, who will care about it? What if the answer is not users, not learners?
That does not mean we should not do it, but the first question we ask should be: okay, aren't there technical improvements or architecture improvements that will bring actual benefit to our users and customers?
So, Jerry asked about performance... sometimes performance is very meaningful. For example, over three quarters of our users come from outside the United States. Today, if people have good broadband connections within the United States, I think the performance is very, very good. I would say we've done a pretty good job optimizing for that part of the stack.
But if people are sitting half a globe away — in Asia, maybe in India or China — the performance is not that great. And when the performance is not that great, engineering sophistication — optimizing away that extra one or two seconds — becomes very important. That's what we've actually been working on for the last six months or so: improving performance for international users.
That is a perfect combination: a good engineering challenge that also solves a very key business problem. That's the best combination.
So what we want to remind people is — as much as possible — let's be very careful about solving a technically challenging problem without clear benefits to the user and the organization, except that it's cool. It does happen. I'm guilty of it a lot of the time; I just throw an idea at the team like, well, let's rewrite this, let's try to improve this.
Let's build this cool system. So I have to remind myself, and try to create a culture in my organization where people ask: what is the benefit to the user? What is the benefit to our business? Right? That should be the starting question people are able to answer.
And, you know, I think over time, once you have asked this question enough, it trains our people to identify the top priorities we want to focus on — the ones with the best combination of those two spaces.
I have a strong bias toward the concept of not reinventing the wheel at Coursera. There are some things we do need to invent, where we need to innovate. Let's go back to our mission of empowering people to use education — the learning experience — to transform their lives.
That is an area where we want to invent and innovate a lot. But we don't want to spend too much time innovating on, let's build our own file system, let's build our own database, let's build our own application framework, right? These are solved problems in the industry. We should leverage the open source community, or even contribute to it, to get to the best solution — without spending time making incremental improvements or reinventing something internally.
Oftentimes that is over-engineering, and it does not bring any benefit to our users and customers, nor to the organization itself.
Patrick: You very vividly described the moment that was really clear for Coursera to need to focus on quality and resolve customer issues.
How does that conversation change at different stages of a company? Are there other — maybe not obvious — signals people can look out for that indicate you need to shift more toward rapidly building features, or more toward quality?
Are there certain things that you're looking out for as an engineering leader that are helping you build a sense of direction for where you should be taking the organization?
Richard: I think there are two types of signals, right? One is the external signals; the other is internal.
So, I talked quite a bit about the external signals. The best tell is: if you are not getting any feedback from customers, don't even try to improve quality, because nobody is using your product, right?
Try to be a product that works, that people want to use, to start with. And then, when you get a little bit of complaint from your customers, don't overreact to it. You should listen and try to understand it — but don't overreact and pause all the innovation you have, right?
So you need to think very carefully about how to effectively collect information, data, and feedback from your customers.
I would say customer feedback is still the best signal we have, no matter which phase of your product. At the very beginning, by default, you don't have that much feedback from your users and customers, right? So you should focus a lot on speed.
The story I shared from three, four years ago — I think that was when we were starting to grow and scale as a company, in the journey of our company, and we started hearing the pain from customers. That was a very important part of our lessons and our journey, and as leaders, I think that is the most important signal to listen to.
Now, even right now, we have overcome that crisis. Once we put in the effort — we spent six or nine months just rebuilding lots of systems — the conversation between our customers and us went back to something else. It is no longer about frustration with the platform quality.
So that actually provides a feedback mechanism: okay, the balance has now been reached; you need to shift the focus to a slightly different ratio. It's not something you stop, right? Once you have reached this stage, you need to at least maintain the status quo. The bar is now here and you need to hold it, and you constantly calibrate to see whether you are still on the right path at that point.
But internally, there are lots of things you can hear from your own employees. When I talk about our own employees, engineers definitely are our own employees — but my starting point would actually be talking to our customer support team.
I enjoy talking to our customer support specialists and trying to understand how much they enjoy their job, or hate their job. They're very talented people and they have a lot of passion for it. But they hate their job when they're constantly hearing from someone who calls or emails them about how frustrated they are with our product — about a particular issue that has been reported to the engineering team, but nothing has been done about it.
Again, they really want to help the company be successful, but sometimes they get frustrated: "Okay, many people complain about it, nothing is done. Richard, is there something you can do about it?"
So the people inside our customer support are probably the closest to our consumer users. Enterprise customers can probably reach out to your CEO and leadership and talk to them directly, right, if they represent that company.
But a lot of consumers do not have the same way to provide feedback. A lot of the time they file a ticket, or they call customer support. So this is also a very good signal for understanding their frustration, or how they feel about the product.
The other thing I like to do in the organization is based on some experience I had many, many years ago at Microsoft. We were building this enterprise product — I worked on a product called Active Directory, which is in Windows.
We were building a new tool that was supposed to help customers solve certain problems with adding machines to an Active Directory. We thought we built it very well; we thought it was so easy to use, internally. I mean, we had some debates on the best way to do it, but everyone thought we did a fantastic job.
And you know what we did? We put this product into a user research session and invited some experienced Active Directory administrators to come and try it — perform these tasks using this new tool we built. And we recorded the sessions of how they did it.
And we saw that they struggled to accomplish the tasks. At that moment, none of the engineers debated anymore about how awesome the features we had built were. People just said, let's file a bug and go fix the issue, because our users — even experienced administrators — could not figure out how to use this new functionality...
So that experience taught me a lot. Sometimes you can look at the number of bugs, you can theoretically debate whether something is good or bad — but nothing is better than seeing how your users actually use the product, and showing it to your engineers. I don't think any engineer feels proud when they see that kind of situation.
So it's also a good way to inspire your engineers — to get some signals and see where their product is. Is it an awesome product, or something that needs to be improved?
Patrick: That's a really powerful perspective to share for how to inspire your engineers and sort of light that fire and create that sense of motivation and momentum to solve that type of problem.
Patrick: I had another question, Richard, because I think one of your unique perspectives is that you have the whole long history of Coursera through a couple of different phases. So I was curious how the speed versus quality conversation has changed in, maybe, the post-IPO stage?
I know you mentioned earlier that the fundamentals haven't necessarily changed. But I was wondering whether there are any nuances or details in how the conversation might be different, or how you arrive at those priorities now, at this stage of the company.
Richard: I think it's probably less related to whether we are a public company or not a public company.
I think it's just a matter of scale. At a certain scale you focus on different types of things, because the opportunities and the risks associated with the business evolve over time. Right?
And over time, as the company scales, you need to think about different aspects of quality.
So again, quality has many different meanings. Sometimes it means bugs; sometimes it means the performance of the site.
But security and privacy — those are also dimensions of quality. That is where customers look to see whether you've missed a basic expectation on the attributes of your product. Many customers, especially enterprise customers, have very strict regulations or expectations: compliance standards you have achieved, or guarantees on certain levels of protection of their data.
So as the company gets bigger and bigger, new dimensions come out. For example, in the last one to two years, we put much more emphasis on improving the overall security posture of our development process and our systems. In the early days, everybody had access to every single system, because that's what was needed for engineers to get their job done — and it probably was the quickest way for engineers to get their job done.
But right now we have over 250 engineers. We need to create some structure so that people have access to the systems they have permission to access, and that they need to get their job done. And we probably need to isolate the systems: for sensitive operations, we need separate isolation, for the people who have the business need and who are the experts to operate on those systems.
A lot of the time it's not even because we don't trust people in our organization — we have inherent trust in all the engineers on my team. But sometimes bad things happen because their system gets compromised, and then someone can go through their system to access certain types of data. So it's our responsibility to think about how we protect our users and how we protect our engineers.
So when you ask about it over time: naturally, at the beginning you don't spend that much effort thinking about all these issues — privacy, security, quality, performance. You just want to get things out and have people try them.
And over time, gradually, new things come out. It can be a functional bug; it can be the scalability of your system; it can be internationalization, because you have many international users... And some of these things, like security and privacy, definitely take a front seat and demand more attention.
Jerry: I have another question about quality versus speed...
One of the categories of questions we get a lot on this topic is: how do you separate or allocate — in terms of time, or in terms of people — the focus on quality versus the focus on speed? I know every organization can be very different, but at Coursera, how do you balance the two?
Richard: Here's the way that I think about organization and people's responsibility.
So first and foremost, I think all of us are in the organization because we want to help the business grow and advance the mission of the company. That is first and foremost, and people need to think about what is needed to achieve that goal.
For example, if we want to double the number of users, or double the number of enrollments — these are the things we want to do. There's still so much work we can do to improve the business. That is still the first and most important thing for everybody to think about.
And that's why our customers use our product. They don't use Coursera, buy Coursera, or enroll in Coursera because we have the best quality. The starting point is that it solves a problem they are facing. Maybe they need to up-level themselves, they need to re-skill their employees, or they want to get a new job.
That is the primary reason people come to Coursera — not because you have two fewer bugs, right? People don't think about it that way.
So I would say that if the way to make the business successful is to create much more innovation, then people need to bias toward building new functionality and solving problems for our users.
But there are times when we know the fundamental blocker is that our customers and users are frustrated — our learners are frustrated with our systems. Then we need to slow down and build better architecture, better quality, so that we can continue to move forward and serve them better.
At the meta level, I think that's what everybody needs to think about.
So now technically, how do we implement that?
So the statement that we defined, at least at Coursera, is: "People should be able to run as fast as possible, as long as you meet these three quality objectives."
So we have a minimum bar for our quality objectives. For example, availability is one thing we measure. Each team owns a set of systems serving the learners, and each of those systems has a set of APIs. We measure the availability of these APIs — the error rate of these APIs. We basically say: run as fast as possible, as long as the availability of your service stays above 99.95 percent on an ongoing basis. That means if you are at 99.96, don't spend much more time improving the availability of your system — you are at approximately the right balance for what you want to achieve.
Go run and build as many features as possible. But if you are at 99.90% availability, you are probably off balance. You are running too fast — maybe you're skipping certain design reviews, skipping certain unit tests, skipping some scalability work in your product.
The end result is that you have more errors than is acceptable. So that means your team gets slowed down.
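That availability gate can be sketched in a few lines of Python. This is a hypothetical illustration, not Coursera's actual tooling: only the 99.95% bar comes from the conversation, and the function names and request counts are invented for the example.

```python
def availability(total_requests: int, errored_requests: int) -> float:
    """Fraction of requests served without error over some measurement window."""
    if total_requests == 0:
        return 1.0  # no traffic means nothing has failed
    return 1.0 - errored_requests / total_requests

# "Run as fast as possible, as long as availability stays above 99.95%."
SLO = 0.9995

def can_run_fast(total_requests: int, errored_requests: int) -> bool:
    """True while the team meets the quality bar and should keep shipping features."""
    return availability(total_requests, errored_requests) >= SLO

# 5 errors in 100,000 requests: 99.995% availability, keep shipping.
# 100 errors in 100,000 requests: 99.9%, below the bar, the team slows down.
```

The point of the mechanism lives in that single threshold comparison: at 99.96% a team gains nothing by polishing availability further, while at 99.90% feature work pauses until quality recovers.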
I find it a very effective mechanism, because when you have a set of very talented engineers in your organization, they don't wait until a crisis happens. Right? A lot of the time they are very smart; they can tell that if they do something a certain way, there will be a consequence two months from now.
And nobody wants to end up in the jail of: okay, availability is too low, now I need to stop building new, cool, innovative features for two months. Nobody wants that. Right?
And the way people avoid that outcome is they see: okay, this is the bar. If I know I'm heading below it, I'd better improve the architecture gradually over time, so I'm always staying ahead of it instead of dropping below and landing in jail. That's how I help my team: basically, run as fast as possible, as long as you meet these three metrics — the basic quality measurement of our product.
Just to be clear, these quality measurements are not identical across all teams. We all understand that certain product areas are more critical than others. Payments, for example, is probably the most sensitive one. People care about transactions — it is a disaster if we charge someone five times for a product, right?
These are the kinds of things where you cannot go wrong and you need a very high standard; in that particular case, the bar on bugs or availability needs to be much higher than for other services. We call those tier-one services. Some tier-one services need a much higher standard than more experimental things, where even if there are some errors, it probably doesn't disrupt a user's workflow or give them a terrible experience.
So you have some leverage in controlling that: for this team, these are the standards you need to achieve; another team may operate with a lower standard, but that empowers them to run faster and try more different things.
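The tiering can be sketched as a simple lookup of per-tier bars. The tier names and all of the numbers except the 99.95% default are hypothetical — the transcript only says that tier-one services like payments carry a higher bar and experimental ones a lower bar.

```python
# Hypothetical availability bars per service tier; only the 0.9995 default
# comes from the conversation, the rest are illustrative.
TIER_SLO = {
    "tier1": 0.9999,        # e.g. payments: transactions must not go wrong
    "standard": 0.9995,     # the default "run as fast as possible" bar
    "experimental": 0.995,  # errors here don't disrupt a user's workflow
}

def meets_bar(tier: str, measured_availability: float) -> bool:
    """A team may keep shipping fast while its service stays above its tier's bar."""
    return measured_availability >= TIER_SLO[tier]
```

The same measured availability — say 99.96% — clears the bar for a standard service but not for a tier-one service, which is exactly the leverage over speed versus quality that Richard describes.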
So that is how we actually structure the organization to strike that balance between speed and quality.
Jerry: I love the approach of answering the question at a meta level first and then zooming into the tactics — and also the flexibility and the simplicity of that methodology.
Richard: Well, think about the traffic system I talked about earlier as an analogy. That's pretty much what it is. In a traffic system, they basically tell you: you can drive 65 miles per hour. Sometimes people speed as well.
But as long as you follow a set of rules in an organization, those rules can help reduce the accident rate. And I'm sure that if lots of people get into accidents, they will change the speed limit from 65 down to 60. That's pretty much what happens, right?
It is the same within a company. You want to make sure there aren't a lot of accidents — people colliding with each other or causing major disasters. But other than that, our job is still to enable the business to run as fast as possible and serve our learners the best way possible.
Patrick: That was an incredible, incredible concept to sort of wrap up this part of the conversation, Richard.
We have a couple of rapid fire questions we wanted to try to squeeze in before our time concluded. So if you're ready, we'd love to transition to a couple of those questions for you.
Richard: I think I'm ready.
Patrick: All right. Perfect. Okay. So rapid fire question number one, what are you reading or listening to right now?
Richard: I really enjoyed Reid Hoffman's Masters of Scale podcast.
Patrick: So number two, what tool or methodology has had a big impact on you?
Richard: You know, as a leader, I really rely on something I call PQPA — precision questioning and precision answering. In an organization there are lots of different things happening, and sometimes it's really hard to get to the core of what people are talking about: what is the exact problem you're trying to solve?
I find this tool incredibly useful for quickly getting insight into the core of an issue and offering help where needed across the organization, by removing the noise around the topic.
Patrick: So number three, what is a trend you're seeing or following that's interesting, or that hasn't hit the mainstream yet?
Richard: I'm really interested in this new artificial intelligence technology called GPT-3. I've seen some demos of it; it's pretty fascinating. If you haven't seen it before: basically, it has processed lots of articles and text material from the internet, and because of the advancement of natural language processing, it can now mimic a human being.
Not just in understanding something — it can even create articles. If you give it some hints, it will write articles for you.
Historically, when we think about AI technology, it's a pretty binary, mathematical kind of thing — it tells you whether this is a cat or a dog when you show it a picture, right? But now artificial intelligence is getting into the creativity space, which is pretty fascinating.
Think about it: in the future, you'll be able to see an article written by a computer — because it has learned so much about certain topics — and you won't be able to tell whether it was written by a computer or by a human. That's both scary and potentially groundbreaking for many companies and applications in the world.
Patrick: I immediately jump to the question: do you think there will be courses on Coursera created by a GPT-3-like natural language processing model?
Richard: Not in the near future. I don't know — maybe 10 or 20 years from now it may become possible. Yeah.
Patrick: That is going to be fascinating. Wow.
Okay. Question number four... What's your favorite most powerful question to ask or be asked?
Richard: I'd like to ask these few questions all the time.
The first question I usually ask, when people are talking about a topic and a particular solution, is one I use to ground myself: "What are we trying to accomplish with this?"
You would be surprised how many times many of us — myself included — fail to answer this very simple question.
Many of us are trained to be problem solvers, especially engineers. You go through it every single day — solving lots of problems every single day — and we enjoy tackling interesting challenges. But at the same time, I think we can easily fall into the trap of finding a very sophisticated solution without clearly defining the problem we are trying to solve.
So I just like to ask: what are we trying to accomplish with this? What will success look like if we do it well?
I think it often resets the conversation to really focus on, does this solve the problem — versus, is it a cool solution?
Patrick: That's great. Final question, Richard... is there a quote or mantra that you live by or a quote that's resonating with you right now?
Richard: Yeah. This actually comes from my time at LinkedIn. Our CEO, Jeff Weiner, always talked about the concept of "act like an owner." It's something I still take to heart after many years, even though I don't work at the company anymore.
I feel that one of the major transition points in my leadership style came from internalizing this statement.
I was a manager even before I heard this statement or internalized it. But a lot of the time, the default perspective on a job is: there's a job description.
There's an expectation of what you do within that job. But there are always things that are not in your job description that, as a leader, you need to take responsibility for solving. And many times, before you take that leap and change your mindset, when a problem happens you think, "Well, there's a problem in the process, so someone else should fix it, because someone else created it!"
Or there's a problem in the technology, and we say, "Well, team B should be working on that. I'll just wait for team B to get it done."
The majority of the time, it's true that we need someone else to help solve the problem. But once I internalized this message, the question always starts from my side as well: what can I do about it? Is there something I can do? And "something I can do" does not mean I go and change the code myself immediately, right?
Maybe sometimes it's as simple as: let me figure out how to describe this problem to the stakeholders — under what circumstances I experienced it, and what my potential recommendation might be for solving it.
So I think making yourself feel like an owner of a problem makes a huge difference in your perspective, and it really empowers you and the rest of the organization to try your best and make the company, the environment, the culture, and the product much better than before.
Patrick: Richard, thank you so much for your time, your stories, and your insights. I know Jerry and I both really appreciate it, and so do our members, the community — everybody listening to the podcast really, really appreciates it. Thank you so much.
Richard: Thank you for inviting me to be here. I had a lot of fun. Thank you.