Our first 1,000 customers

Phil Hewinson
18 min read · Sep 28, 2021


On May 15th, we had our first customer. A lady named Charlotte paid £1.49 to access all of her suggested careers with pathways into them. Since then we’ve run 16 separate product, ad and pricing experiments. And we hit the milestone of our 1,000th customer on September 3rd. We’re now at 1,255 customers and a total of £12,082.95 in revenue. We even signed up our first two business customers too — Eton College and Middlesbrough College!

We’ve also made a ton of progress on our product. Laura completed the research of pathways into all of our careers. So every career now provides a user with the steps they need to take to get into it, along with the time and cost. We tell people what their transferable skills are into their careers of interest based on their current and past jobs. We show day-in-the-life-of videos for 200 of our 500 careers. And we allow people to add and remove careers from their final results.

It’s been 7 months since I last posted an update. And that’s because it’s been such an intense time that I’ve not had a chance to write an update till now. In this post I’ll walk you through the past 7 months — how our strategy and thinking evolved over that time, our journey to 1,000 customers, more on the improvements we made to the product, and where we’re heading next…

Evolution of our strategy

Our goal for 2021 was to generate £100k in revenue. We decided to prioritise commercialisation so that we can sustain and scale our impact.

Towards the start of the year we focused on a recruitment model. Our rationale was that by helping people get into a job they love, we’d be creating value for the employers who hire them. And so employers would come to us to find and pay to access the best candidates. And we decided to start by focusing on graduate recruiters.

So in March we started talking to university leavers and graduate recruiters to better understand their needs. We also broadened our customer interviews to many other segments so we could develop a broader understanding of needs to best identify the optimal target for our software. These included independent training providers, school and college career leaders, careers departments at universities and independent career advisors.

Over a six week period, Emma and I spoke to 40 people. These included 12 university students, 9 graduate employers/experts and 4 university careers advisors.

Graduate recruiters

As we started talking to graduate recruiters, we quickly realised that they didn’t have really strong needs around finding candidates. Many effective channels already exist to advertise jobs to students. And many companies already have fairly mature systems in place to screen and qualify candidates. There were some needs around finding candidates for certain roles (such as sales, where it’s hard to test for soft skills) and specialised roles. But the needs weren’t strong enough for us to prioritise this strategy.


So we pivoted to focus on selling access to our software to schools and colleges. We ran a handful of school and college trials over Zoom and in person, which taught us a tremendous amount and worked very well.

But as we talked to the students, their need for great career advice wasn’t that strong. I reflected on this in my own journey. When I was at school, I was more interested in my studies and grades, not the career I wanted to pursue. That was too far into the future.

There is a need in schools and I believe we can make a big difference there. But I think the needs of people who have already started their careers but are unhappy are much stronger. So I think we’ll come back and focus more on schools in the future.

Having said this, we did close a couple of sales! Emma and I went to Eton College and ran a session for their Year 11 group. We spent an hour with 200 Etonites in a large auditorium.

I was pretty nervous as we’d never tested our software on 200 concurrent users before. I knew that the quiz executes 32 million computations per user. So I was acutely aware that my single server would be running a few billion computations over a couple of minutes. Thankfully it all worked! And Eton purchased a site licence soon after!

We also sold a licence to Middlesbrough College who have 14,000 students!

Direct to consumers

In order to sell to schools, we had to paywall our product. Our quiz and software have always been free, except for a small experiment I ran at the end of 2019, which didn’t yield a single purchase! Given that failed experiment, and the fact that the 100 users we’ve spoken with over the past year have only ever paid for training and coaching when it comes to their careers, I wasn’t very optimistic people would pay for our software. But if we kept it free, we couldn’t really charge schools to access it.

So in mid-May we paywalled our product. Our goal was to lower the barrier to paying as much as possible. So we chose a crazy low price of £1.49 for 6 months’ access — what we called a “launch offer”, reduced from £9.49 and valid for only a few days. And we updated the homepage, all the copy on our site and created new ads to reflect this.

I was surprised when we had 33 sales that weekend! I assumed no-one would pay given our user research. So this taught me to not hold on to what people say too strongly. It also taught me not to be afraid to charge users money!

The plan was to focus on selling to a couple of other organisation types next — private training providers and career coaches. But given this initial surprising success, we instead decided to double-down on selling directly to consumers using Facebook and Instagram ads.

Our journey to 1,000 customers

Over the past 4½ months, we ran 16 separate product, ad and pricing experiments with the goal of optimising the ROI (return on investment) on our ad spend. For our first experiment, we spent £642 on Facebook and Instagram ads over four days and generated £49.17 in revenue (33 sales at £1.49 each), which gave us an ROI of 7.7%. The work we did over the following 4½ months, across 15 more experiments, was focused on driving this number up as high as possible.
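For the curious, the ROI figure here is simply revenue as a fraction of ad spend. A minimal sketch using the numbers from that first experiment (this is illustrative, not our actual tracking code):

```python
def ad_roi(revenue: float, spend: float) -> float:
    """Return on ad spend, expressed as a percentage."""
    return revenue / spend * 100

# First experiment: 33 sales at £1.49 each, against £642 of ads
revenue = 33 * 1.49                      # £49.17
print(round(ad_roi(revenue, 642), 1))    # → 7.7
```

At 100% ROI, every pound spent on ads comes straight back as revenue, which is the break-even point we've been climbing towards.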

We’re now getting closer to an ROI of 100%. We still have a few pricing experiments to come. And we’re not even using email yet to re-engage our users. So if a user doesn’t decide to buy in the moment, they may never come back. But once we capture their email address, we’ll have more opportunities to up-sell non-payers and get payers to invite their friends or even gift it to them. So we have quite a few levers left to drive the ROI higher.

We track each experiment (what we did and the results) in a big Google Slides presentation. It currently numbers 182 slides! We measure everything!

Facebook and Instagram ads

The primary channels we use to acquire new users are Facebook and Instagram. We tested other channels, such as Google, TikTok and Snapchat, but we couldn’t get them to work very well on the first try. And we have very limited capacity. So we’re sticking with just Facebook and Instagram for now.

Ad creative

For our first experiment, we tried a variety of different ads. Emma designed this price comparison ad, which performed far better than the others:

But as we increased the price, we had to update this ad. So here’s what we came up with:

Narrow vs broad

Another big debate we had with a variety of experts over the past few months was whether we should target people in a narrow or broad way. Facebook provides advertisers with very powerful targeting capabilities — like demographics, location, interests etc. One school of thought is to use these targeting capabilities to decide who our ideal customer is and find them that way. Which sounds very sensible. This is called “narrow” targeting.

But another school of thought is to target as wide an audience as possible, report back to Facebook each time a user pays, and let Facebook optimise using its own data to find the ideal user. The argument here is that Facebook has far more data than we do about users. And its AI is far more effective at optimising the targeting than we could ever be. The bigger the audience, the better Facebook can optimise its targeting. There are counter-arguments, but this approach also sounds very sensible. This is called “broad” targeting.

As we started talking to experts, everyone fell into one camp or the other. Almost a 50/50 split in fact! So we figured we should test both. I spoke to a few ex-Facebook colleagues who worked on ads, and they all suggested broad targeting. As did an agency that I respect. So we focused on broad targeting most of the time. But we also spent a month testing narrow targeting.

For us, broad targeting yielded better results. And it scales better with less effort. So this is our preferred approach for now.

iOS privacy headwinds

One challenge we’ve faced over the past few months was due to an update Apple made to its iPhones back in April. Since then, an increasing number of users have disabled the ability for apps to track them, which means Facebook targeting doesn’t work as well.

From talking to people in the industry, this has had a material impact on ad performance for most companies. I’ve heard reports of ad performance dropping by 30–40% for most companies over the past few months. And as we started our experiments at the beginning of this period, it has meant that we’ve had to face this headwind. We’ve actually made great progress despite this. But it’s been much harder to improve our performance than it would otherwise have been.

Optimising our funnel

In the tech industry, the stages a user goes through from landing on a homepage to eventually making a payment are called a “funnel”. Think of it as many users entering at the top of the funnel, with only a few making it to the end and paying.

A funnel can be optimised by measuring the drop-off at each step, looking for the biggest drops, and running product tests to see if these can be improved. We made many such improvements over the past few months, and have improved our funnel quite a bit. In our first experiment, 2% of people who clicked “Get started” on our homepage went on to pay. But in our latest experiment, this has increased four-fold to 8%. This isn’t all attributable to our funnel improvements, but the changes we’ve made here have helped a lot.
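Measuring the drop-off at each step is straightforward in principle. A quick sketch of the idea (the step names and counts below are hypothetical, purely to illustrate; the only real figure is the 8% overall conversion mentioned above):

```python
# Hypothetical step counts for one experiment (illustrative numbers only)
funnel = [
    ("Clicked 'Get started'", 1000),
    ("Completed quiz", 600),
    ("Viewed final results", 550),
    ("Clicked 'unlock'", 120),
    ("Paid", 80),
]

# Step-on-step conversion: each stage as a fraction of the previous one,
# which is where the biggest drops (and biggest opportunities) show up
for (step, n), (_, prev) in zip(funnel[1:], funnel):
    print(f"{step}: {n / prev:.0%} of previous step")

# Overall conversion from top of funnel to payment
print(f"Overall: {funnel[-1][1] / funnel[0][1]:.0%}")  # → Overall: 8%
```

The biggest step-on-step drops are where product experiments are most likely to pay off, which is exactly how we prioritised the changes described below.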

Small changes can make a big difference! For example, this was the “unlock” card we showed users in their final results to get them to pay on experiment 1:

In a later experiment, we updated it to this unlock card:

This doubled the number of people who clicked “unlock”, and increased our revenue by 80%!


Pricing

We’ve experimented with different price points and we still have a few experiments left to do. Our starting price was £1.49, which was really low. This was really just to see if we could get people to pay at all. We rapidly increased this price to £4.49, £9.49 and then £14.49. Our ROI dramatically improved up to £9.49, but then dropped a little at £14.49. So we settled on £9.49 for most of our experiments.

But we’ve also dramatically improved our product over the past few months too (more on this later). As well as our ability to communicate the value of our product. So we’ll be testing higher prices in the next couple of weeks to decide where we keep our price long-term.

As we released new product features, we were then able to test tiered pricing. We had enough features to split them up into two tiers — standard and premium. So we tested this a couple of weeks back, with standard costing £9.49 and premium costing £19.49.

The initial results are promising, although we’ll need to make a few more tweaks. But we’ll probably keep a tiered approach.

How we charge

We initially gave users access to two of their final results careers for free, along with all information such as their personalised pathways into them. But to see the rest of their careers and pathways, they had to pay. And this worked pretty well.

But we wanted to experiment with other methods to make sure we had the approach that maximises our ROI.

Showing more information for free

We ran a test whereby we gave more away for free, in the hope that people would better realise the value of the product and then pay. So we gave them a view of all their recommended careers, and all the information, but they could only explore two. After they’d explored two, we’d lock all careers, so they couldn’t see any of the detailed information for any of them without paying.

And this completely failed. Our revenue dropped by about 80%! But for me, it was one of the best learnings. It made me realise that the value we provide users is advice and information. As soon as we’ve shown that information, the value has been exchanged and there is less value left to pay for. So we needed to go the other way, and provide as little information as possible until a user pays, so we could maximise the amount of value we had to exchange for payment.

Showing less information for free

So we worked hard to hide all final results careers, and only show them once a user pays. The challenge here was communicating the value of the product without showing the product. So we worked hard to redesign our homepage and show screenshots of the features. At the end of the quiz, we even show top insights for the user. Such as how many careers are in their final results and how many they can enter immediately and at no cost.

This worked much better and doubled the conversions from final results users to paying customers!

Charging users up front

The last main method of charging users is up front, before they even start the quiz. One of the downsides to this approach is that you can’t capture their email address to try and up-sell them after their session. If instead you charge them after completing the quiz, you have the option of asking for their email first and sending them insights and interesting career content weekly, along with an up-sell. Which would probably improve ROI. For this reason, I wasn’t too keen on this approach.

But over the last week we kind of tested this approach by accident. As we launched tiered pricing, we communicated our price much more strongly on the homepage. We made it clear at the top under the “Get started” button:

And we introduced a new “Pricing” section:

I assumed tiered pricing would increase our ROI, but it actually dropped from 67% to 44%! But when we dug into the numbers, the conversion on the payment page was actually slightly stronger. It turns out that the conversion from the homepage to clicking “Get started” dropped from 40% to 26%! As we communicated the price so much on the homepage, people were making purchase decisions before they even started our quiz, which caused many of them to not even start! Which is very similar to the approach of charging users up front.

So at lunchtime yesterday I spent two minutes removing the new “Pricing” section and changed the text under the “Get started” button to this:

And the effect was dramatic. Our conversion recovered back to 40% yesterday. And we had our fourth strongest day of all time, at 101% ROI! We’ll measure the results over the next week to be sure of the performance. But I’m feeling positive about it!

Our homepage

We must have made at least ten fairly large changes to our homepage over the past few months. We’ve spent a lot of time on it! I thought our homepage was in pretty good shape back in February, but it’s so much better now! Here’s a side-by-side view of our homepage at the start of June and now.

Even by June we’d made a number of changes and improvements, but I’m much happier with our latest version. Emma designed this new homepage as well as the latest way we charge users by giving them insights — all excellent designs that have made a big difference to our product and its performance!

Improvements to our product

Most of our time over the past few months was actually spent on improving our product. Since we started charging users back in May, we’ve always offered a “money back guarantee”. And we’ve had a few users request refunds. But what we’ve found is that as we’ve finished and shipped new features, we’ve had fewer and fewer refund requests, showing that people are getting more value out of the product.

Here’s a quick run-down of all the progress we’ve made with the product since the end of February…

Completed Pathways

While we had shipped our Pathways feature at the end of February (more on that here), we’d only enabled it for 81 of our 504 careers as each new career requires about 1–2 hours of research. So we hired Laura at the start of March to complete this research. It took her 5 months of intense work, and she completed them by mid-August. It was a truly valiant effort and really excellent research.

We gradually rolled out new pathways as we researched them, so the product became steadily more complete over that period. And once they were all complete, we could enable other features, like the ability to sort careers by time, cost and starting salaries.

We also made many incremental improvements to the Pathways feature over this period. For example, we surfaced more of the summary information for the best pathway on the final results screen, so users can see what each involves at a glance. And we highlight the careers that have pathways that can make use of their degree or other qualifications they may have.

And we use pathway information intelligently for career filtering in the quiz too. For example, for university undergraduate students, if a career doesn’t use their degree in any way but all pathways into that career require a different degree, a long apprenticeship or school-level qualifications they don’t have, we’ll exclude it from the quiz.

And we even started validating our pathways with professionals in each career. We validated about 10 careers, which gave us quite a few small improvements to make. And we plan to validate them all in the coming months once our priorities allow.

In-education variant

Our product back in February had a single flow that everyone went through. It had four screens that asked about their situation, location, education and experience. The problem with this is that it didn’t work too well for people in full-time education. For example, it asked about their past education, not what they’re currently studying.

So we designed more custom flows for different cohorts of users. And only asked for information relevant to each cohort. For example, if a user is in secondary school, we ask what school they go to and what year they are in. If they are in year 12 or 13, we also ask what A-level subjects they are studying. We don’t need to ask for their location, as we can infer that from their school (which we capture in a structured form). And we don’t need to ask about their experience, because we can assume they don’t have much of a work history. This essentially means that the school flow is just two screens, and all the questions are relevant.

And now that we know what stage of full-time education they are in, we can customise pathways to reflect this in the time. So if someone is in year 10, and say they are using the quiz in April, we know that they have 1 year and 4 months to complete their GCSEs and 2 more years to complete their A-levels if they go on to do those. And so all of their pathways that have GCSEs and A-levels can then be updated to reflect these timings for the user.
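The timing logic above is just date arithmetic from the student's year group. A minimal sketch, assuming GCSEs wrap up at the end of August of Year 11 (the milestone date and function names are my assumptions for illustration, not Pickle's actual code):

```python
from datetime import date

def months_between(start: date, end: date) -> int:
    """Whole months from start to end."""
    return (end.year - start.year) * 12 + (end.month - start.month)

def months_to_gcse(today: date, year_group: int) -> int:
    """Months until GCSEs are complete, assuming the school year
    (and Year 11 exams/results) wraps up at the end of August."""
    years_left = 11 - year_group              # e.g. Year 10 → 1 more school year
    end = date(today.year + years_left, 8, 31)
    if today.month > 8:
        # Past August: the current school year has already rolled over
        end = end.replace(year=end.year + 1)
    return months_between(today, end)

# A Year 10 student using the quiz in April:
print(months_to_gcse(date(2021, 4, 1), 10))   # → 16, i.e. 1 year and 4 months
```

The same offset can then be applied to every pathway that starts with GCSEs, with a further two years added for A-levels.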

We tested this variant with a variety of students, through a number of school, college and university trials. All went well and enabled us to make a few tweaks to further improve the flows.


Transferable skills

Back in February, each career had a “skills match” score. This was a feature we built back in November, which worked pretty well. But some users found the score confusing. And others wanted more granular information on exactly which of their skills were relevant to their careers of interest.

So we revamped this feature. We turned the score into a label, so that we only highlighted the top 25% of careers with the highest scores using the label “strong skills match”. And then when users explore a career, we created a new “transferable skills” section that shows the top skills they have in common with this career based on their work history. And clicking on each skill shows more information, as well as the past jobs they’ve done that they may have acquired this skill from.

This feature was built using the same data set as the original skills match feature, which you can read more about in this blog post. And initial user tests suggest this works well and creates lots of value for our users.


Adding and removing careers

Other feedback we’ve had from users is that their final results are missing one or two careers they are interested in. So we created a way for users to add careers to their final results, and to delete them too.

The first way to add careers was to enable them to search for specific ones.

And then we built a way for users to browse our whole database of 500+ careers. We enabled browsing by industry and their shortlist of qualities based on the quiz. We also enabled sorting by time, cost, starting salaries and jobs. Given that this information is hyper-personalised and uses the same code that powers pathways, it’s actually a really powerful way to browse careers.


Day-in-the-life-of videos

We had a fantastic intern join us over the summer: a Cambridge University student called Georgie. She spent 8 weeks with us and made a big difference in that short space of time.

In her first 3 weeks, she researched and found high quality day-in-the-life-of videos for 200 of our 500 careers. So we launched a feature that shows these videos as users explore their careers.

There didn’t seem to be high quality videos for the remaining 300 careers. So Georgie set about recording her own for a small sample so we could evaluate the quality, method and cost to complete them all. In total, Georgie recorded 8 videos of her own and they’re all excellent! Here’s the video she recorded for Cake Decorator:

We’ll complete this feature in the coming months as our priorities allow. But it’s great to have 40% coverage already, and knowledge of how much time and money it will take to complete them all.

What’s next

We’re focused on three strands for the rest of the year, which I’m hopeful will give us the metrics we need to take the company to the next level…

We’ll continue to optimise the performance of our current product. We have a couple of pricing experiments left, and we’ll probably also set up email engagement so that we can up-sell non-payers and get payers to share with their friends. These relatively low effort initiatives should have a positive impact on our ROI.

We’re also focused on creating more value for our existing customers. Our customers want to change careers, but it takes a lot of effort and we intend to provide more support. Possibly one-to-one support through a monthly subscription. We had a week-long offsite in the Peak District a few weeks ago to brainstorm and design this. Emma and Laura are now leading this initiative.

And finally, in addition to our direct-to-consumer approach, we’re also exploring selling to organisations. There are a few organisation types with a lot of potential, including outplacement companies (that support people going through redundancies), training providers (who deliver paid courses to people who are upskilling but may not be sure what career to go into), career coaches and universities. We hope to make material progress in exploring these targets in the coming weeks and will double-down on any that show potential.

I’m hopeful we’ll have some great commercial metrics by the end of the year, which will give us the foundations we need to scale the company and our impact into 2022 and beyond!