The media loves to highlight the stories of startups that have become "unicorns." Far fewer people, however, hear about these companies passing a much more significant milestone: product-market fit. Today, I want to go over a few compelling product-market fit examples with you and extract some "lessons learned" from them.
Product-market fit indicates that whatever you were building and promoting has become successful in the market, with a user base that loves your product and is happily paying for it.
Finding product-market fit is an essential milestone for any startup and its product management team, as reaching it helps you become sustainable in the highly competitive market and sets the stage for your further growth.
Now let’s begin our journey with probably the best-known PMF story out there—Airbnb.
Everyone’s favorite accommodation booking service did not always look as good as it does now. In fact, this is what the first version of Airbnb (a.k.a. AirBed&Breakfast) looked like.
But Airbnb has come a long way since its first release. Apart from a great-looking website, it now also comes second in the online booking market with 6 million property listings and 300 million bookings in 2021 alone.
This massive success was a result of the hard work and the Airbnb team’s creative approach toward finding PMF and growing the service. Wanna know how they did it?
It all started when Joe Gebbia and Brian Chesky (Airbnb’s co-founders) were late on their rent for an apartment in San Francisco and had the idea of hosting a couple of designers at their home (who came to attend a conference) to cover the rent.
They knew that people might not be comfortable staying in somebody else's house. So, to look trustworthy, they built a basic website where they offered their services to attendees of the IDSA conference (a well-known design conference in the U.S.).
This was a success, as they ended up hosting a couple of attendees from the event.
This gave them an idea for starting a service that would connect the people looking for cheap accommodation with those ready to host guests in their apartments.
But they were not sure if there were people out there who would be willing to rent out their apartments. So, they went out and recruited potential hosts in Denver a couple of weeks before the DNC (the Democratic National Convention).
Afterward, the Airbnb team added a map feature on their website that allowed users to find hosts near the location of their interest (which, in this case, was the convention center where the DNC was taking place).
This “hack” was a success, and Airbnb soon became the go-to service for many conference attendees who wanted to book apartments near the event venue.
Once the Airbnb team knew that it had both demand (travelers) and supply (hosts) for its service, it started focusing on growth.
They assumed that good-looking photos of homes would result in more bookings. So, they hired a professional photographer to visit the properties listed on Airbnb and take gorgeous photos of them. The result was a 3x increase in bookings for these apartments and a significant boost to Airbnb's growth.
Finally, the Airbnb team came up with an ingenious (and perhaps not entirely ethical) tactic: letting hosts automatically repost their listings on Craigslist using bots and scripts. This let hosts reach a larger audience of travelers interested in their apartments while avoiding the hassle of manually signing up and posting on that listing site.
Thanks to all these hacks and clever tactics, Airbnb was soon able to demonstrate a clear product-market fit for its service and business model.
While many of us might dislike the Craigslist idea, we can all agree that the team at Airbnb did a great job of building their product, and their experience is worth learning from. In my opinion, there are two key "lessons learned" in this story:
Cheap validation of core hypotheses: There was a reason I highlighted three sentences in the story above—these are all core hypotheses for Airbnb. The first two represented the supply and demand for the service, while the third one was an example of a growth hypothesis.
Gebbia and Chesky knew that their idea was risky and there was a chance that people would not find Airbnb useful. So they found cheap ways to test their core hypotheses before investing more time and money into their service.
Clever usage of concierge testing: Before building anything, the team tried to do it manually first. The two prominent instances were when the team went out and recruited hosts before the DNC, and when they hired a pro photographer.
Doing things manually is also known as concierge testing. It is one of the well-known tactics for validating your hypotheses that lets you succeed or fail with your idea before investing in building a product feature based on it.
You might recognize Segment as one of the prominent players in the customer data processing space. But did you know that it started as a tool to tell your professor that you are confused during the lecture?
That’s right, its original name was ClassMetric, and it looked like this.
The screenshot here comes from a lecture at UC Berkeley where students found the final part hard to follow, as the level of confusion (the red line on the chart) peaked.
Before reaching its PMF as Segment, the folks behind it were struggling to survive as ClassMetric.
Their idea seemed good enough in the beginning; it even secured a $600k investment. But that was before they had built an MVP (minimum viable product) and tested it on students.
When they had a workable product ready, they distributed it to different universities and started testing it with students. What they found was not encouraging: most students communicated with each other and the professor using social media or email, and only 20% of students used ClassMetric when given the opportunity.
It meant that the customer need they intended to cover with ClassMetric was not strong enough for the students to change their existing habits and switch to the app.
So, the team decided to abandon ClassMetric and work on another product they called Segment.io. It was a product analytics tool with a specialization in segmentation. This idea was a failure too, as they did not acquire any customers within a year of working on it.
But there was another tool that they had developed during this period to support Segment.io. It was a database solution that was able to gather, organize, and distribute user data. Unlike the main product, this solution did gather significant traction and soon became the main product for the Segment team. The database tool was so successful that the founders eventually sold it to Twilio for $3.2 billion.
Segment’s experience is quite unique and we can definitely learn from it. Here are two of the most important takeaways that I want to point out in their story:
Test your core hypotheses before building an MVP: Yes, the core function of an MVP is to test hypotheses too, but there are much cheaper ways to get an early indication of the viability of your idea before you start development. We have an entire guide on validating startup ideas that you can check out for more information on this matter.
It is absolutely OK to pivot: If your idea fails, don’t think about giving up yet. You can always change the course of your product and try something else. Pivoting is normal and quite common in the world of startups. Sometimes you’ll discover that it is one of your side tools that people really want.
Did you know both Gmail and Google Ads were internal tools in Google before they became super popular? I rest my case.
Trilogy Education was a tech boot camp service founded in New York in 2015. Its aim was to deliver tech education in a non-traditional format and bridge the gap between students and their dream jobs.
While it started out with a 50% failure rate on its courses (students were dissatisfied, angry, and demanded refunds), the team behind it soon managed to get the service back on track and secure a $750 million acquisition.
Now let’s see how they overcame their challenges and reached their PMF.
To increase the satisfaction rate of their courses and make the service scalable (there was no way to scale with a 50% failure rate), the team decided to create a "magical" experience for their students and focus their efforts on increasing this "magic."
The magical experience consisted of five components.
The Trilogy team started revising the teaching methodologies, class curricula, teaching styles of the instructors, and other aspects of their service to reach excellence in these five areas.
To measure the results of their efforts, the team devised a series of KPIs, including course NPS and the percentage of students who felt "unsupported," and measured them after each course.
With this data at their disposal, they started identifying weak points in their processes and iterating on fixes for them.
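To make the NPS part of this concrete: Net Promoter Score is computed from a 0-10 "how likely are you to recommend us?" survey as the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). The sketch below is an illustration of that standard formula; the sample scores are invented and have nothing to do with Trilogy's actual data.

```python
# Minimal sketch of computing course NPS from 0-10 survey responses.
# The sample scores below are made up for illustration.

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

course_scores = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8]
print(nps(course_scores))  # 5 promoters, 2 detractors out of 10 -> prints 30
```

Tracked after every course, a number like this makes "are we getting better?" a factual question rather than a feeling.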
The result of all these efforts was a definite product-market fit and an offer to acquire the startup by 2U—one of the giants in the EdTech industry.
The team behind Trilogy did a great job of solving the issues with their service, and here is what we can learn from them:
Define KPIs and systematically measure them: If you want to improve something in your product, you need to measure it first. Trilogy was able to define its “magic” and systematically measure the KPIs that represented it. This process helped them have a clear overview of their performance and pinpoint weak areas.
Focus your efforts on improving the KPIs you have set: By fixing the weak points you identify through KPI monitoring, you will gradually improve your customer experience and eventually find your "magic" zone.
PaintBerri is online painting software and a community for artists to create and collaborate.
Unlike the other startups here, which successfully reached their PMF, this one is a story about "successfully not reaching PMF." That's right, I did not misspell "successfully," as this is a story about failing correctly (failing, just like pivoting, is absolutely OK!).
Well, maybe using the term “failing” is a bit harsh for Katherine Tung, who founded PaintBerri in 2014 with an active user base of people who loved the product.
The product itself was a huge success, but it was the aspect of monetization that stopped the team from actively developing PaintBerri. Although they had lots of active and passionate users on the platform, the user pains that the product covered were not strong enough for the users to start paying for it.
The reason I consider this a “successful failure” is that the PaintBerri team used Lean approaches to the development and validation of their product. They were able to put the startup on hold in an early stage without wasting their (and their potential investors’) money.
The PaintBerri team read a variety of books on Lean development, such as Dan Olsen's The Lean Product Playbook (I recommend you read it too), and followed the build-measure-learn principle. They built a small MVP that they shared with a closed beta community of 500 users. Then they developed their public beta version, which looked like this:
The public beta brought around 6,000 artists and painters to the platform, and the team started validating and learning.
By gathering both quantitative and qualitative customer feedback with questionnaires and interviews, Katherine was able to pinpoint the key feature set of the product that was adding the most value for their users (it was their social/collaboration suite) and focus more on improving it.
Apart from this, they also started experimenting with different monetization techniques and measuring their performance. This is when the team found out that they had a good problem-solution fit (people loved the product) but not a PMF, as there was no tangible willingness to pay.
Gathering lessons learned from successful stories is great, but learning from failures is much more valuable. In the case of PaintBerri, their “successful failure” teaches us that:
Following all the lean best practices will not guarantee you success: The Lean Startup methodology is not a silver bullet. It will not guarantee you 100% success with your ideas. Instead, it ensures that you fail early, without spending much time and resources on the idea.
It is absolutely OK to fail: Before building a successful startup, founders usually fail a couple of times and learn from their mistakes. Failure is not something to avoid or be ashamed of. The only thing to consider here is to make sure that you fail early to avoid wasting time and money.
Novos used to be an analytics tool for gamers to visualize their gaming performance and improve upon it. This product proved too hard to monetize, so they soon pivoted to providing customized training programs to gamers instead.
Unlike the original idea, the second one succeeded and earned Novos its PMF. Now let's see how they did it and what we can learn from the Novos team.
Before building the product itself, the Novos team had already amassed a significant community of gamers and enthusiasts on Discord and Overwolf.
They took advantage of this community to test their product and gather feedback both on key use cases and monetization potential. Just like PaintBerri, they found out that the value prop was not strong enough for the players to pay for an analytics tool.
So they talked to several thousand (!) of their community members and found out that many of them were sharing training guides with each other in PDF form through the community. This is where the idea for custom training programs appeared.
The community adopted it quickly and started paying for it. This traction was an indication to Novos that it could go beyond the community and reach PMF.
The main takeaway from the history of Novos is probably obvious. You should take advantage of the online (and offline) communities of your target audience as they will provide you with valuable feedback and become your early adopters.
Our last example of product-market fit is about Superhuman, which is an email client that helps you become highly productive with your email communication.
They started with lots of struggles (for one, they launched too late), but ended up solving most of their problems and subsequently creating a framework for reaching PMF—which they did!
This so-called framework was about making PMF measurable by setting a KPI for it and then devising a process to improve that KPI until they reached their product-market fit.
Rahul Vohra, the founder of Superhuman, chose the Sean Ellis Test as the key KPI. After running the test on early adopters, the team found out that the proportion of users who would be “very disappointed” if Superhuman did not exist was 22% (below the required benchmark of 40%).
So, the team decided to focus on the pain points of the "somewhat disappointed" group, identify their underserved needs, and cover them with new features.
The framework of measuring a leading indicator metric and improving it was a success: the team soon hit the 58% mark and reached its PMF.
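The metric itself is simple to compute: survey users with "How would you feel if you could no longer use the product?" and track the share answering "very disappointed" against the commonly cited 40% benchmark. The sketch below illustrates that calculation; the response distribution is invented (loosely mirroring the 22% figure above), not Superhuman's actual survey data.

```python
from collections import Counter

# Sketch of scoring a Sean Ellis survey. The responses are invented
# for illustration; only the 40% benchmark comes from the methodology.
BENCHMARK = 0.40  # commonly cited threshold for product-market fit

def sean_ellis_score(responses):
    """Fraction of respondents who would be 'very disappointed' without the product."""
    counts = Counter(responses)
    return counts["very disappointed"] / len(responses)

responses = (["very disappointed"] * 22
             + ["somewhat disappointed"] * 52
             + ["not disappointed"] * 26)
score = sean_ellis_score(responses)
print(f"{score:.0%}")      # prints 22%
print(score >= BENCHMARK)  # prints False -> keep iterating
```

Rerunning the same survey after each batch of features turns "are we closer to PMF?" into a trend line you can watch move toward the benchmark.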
The story of Superhuman proves that you can successfully measure your PMF using leading indicator metrics. In fact, we have an entire guide on analyzing product-market fit that can teach you how to do it.
Reaching product-market fit is probably the biggest challenge a startup has to face. Luckily, many others have already gone through it and amassed considerable knowledge that you can use to avoid the pitfalls along your journey.
I hope that the lessons learned from these stories will help you find your own PMF one day. But before that, I also recommend reading a couple of other compelling guides that my colleagues have prepared for you, including our other how-to guides.
We also have a newsletter that you can subscribe to if you want to receive lots of product management goodies like this!