15 Product Experiments in 6 Weeks

Phil Hewinson
48 min read · Oct 13, 2020

We’ve had a pretty intense and productive start to September. In 6 short weeks, we ran 15 product experiments! That’s an experiment every 2 days, between just two of us. And we’ve learnt a lot! It’s been quite the rollercoaster — full of ups and downs. We haven’t yet cracked commercialisation. But we do have a new strategic direction to head in and much better clarity and confidence in where we’re going. And some nice new features in our app too …

We did plenty of preparation leading up to these experiments. As I talked about in my last blog post, we spent 10 weeks interviewing users and customers, running a design sprint and prototyping our ideas. Emma also joined the team in August, and she's hit the ground running, doing a lot of research to support our initial experiments.

This blog post will walk you through our journey over the last 10 weeks, what we learnt and where we’re heading next.

August Preparation

The first two weeks of August were pretty busy. Our goal was to complete a number of necessary projects before my two-week long break at the end of August. We’d then be in a position to start executing rapidly on our product experiments in September.

Emma joined the company on August 3rd. So I spent quite a bit of time helping her get set up and settled in. I also wrote up a midpoint report for Nesta, which was a requirement of being part of the CareerTech prize. And I wrote a blog post for Nesta’s blog on product discovery.

Brand & Design

We worked with the Forward Partners Studio team on our brand and visual design over 5 days. In preparation, I had some homework to do. I pulled together the latest articulation of our vision and mission. And described four personas (groups of people) that depicted our target users. I pulled together interesting competitors of ours as well as brands that I personally like. And we decided on six brand characteristics that best describe Would You Rather Be. These are:

  • Trusted
  • Approachable
  • Adventurous
  • Optimistic
  • Imaginative
  • Fun

We then spent a morning sharing all of this context with the team. And then we went through a few daily iterations of designs.

Idents, Colours & Typography

Our designer, Josh, produced lots of ideas and options for idents, colours and typography by day 2. These were all inspired by our six brand characteristics. Here are the idents he put together:

And the colours:

And finally the typography:

We discussed each one and decided on our preferred options.

Top 3 Designs

Josh then incorporated our preferred options into three distinctly different designs by day 3. And he designed the main screens of our current web app and new prototype in these designs to give us a flavour of what each one would look like.

Here are the three designs he put together:

We discussed and debated our favourites. Initially we were leaning towards the first one, with the flag design. But we decided that the second set of designs was more versatile. And we felt it fitted better with our brand characteristics, especially being fun. So we settled on the second.

Our Design System

And the last step in the process was our full design system that Josh put together. This included our logo, colours, spacing and typography.

MVP Design

We then spent the second week of August working with the Studio team to fully design our MVP that we’d work towards building. This was the output from our 10-week-long process of product discovery. We also applied our new visual design to our MVP.

We started this process by storyboarding the end-to-end user experience of our MVP. This broadly included these steps:

  1. Complete our current career survey
  2. View the careers pages
  3. Shortlist careers
  4. Choose one career
  5. Set up a call with a career coach
  6. Have the call
  7. Follow up on actions after the call

Most of the product work would happen up to the point of setting up a call with a career coach. So we then mapped out all the functionality and information required at each of these steps to help Josh design each screen.

Emma and I also prepared all the copy and assets we needed. This included a blurb about the company and the team, text for our career change proposition, testimonials (from feedback we’ve been getting through our app), images and the information we’d include on the careers pages.

Here are some of the final mobile designs of our MVP, which I think look great! …

We haven’t yet implemented these designs. But we hope to have them built and live in the next month or so.

Data Work

I also spent some time at the start of August on our data to make sure we’re able to scale and so I don’t have to worry about it.

GDPR Compliance

Part of this work was to ensure we’re GDPR compliant. So I updated our privacy policy and implemented all the things the new privacy policy says. This included the principle of “data minimisation”, which means that we should only store the data we need. I had been storing the email addresses of users who hadn’t opted into future emails. So I deleted all of these and ensured we didn’t store them going forwards. I also implemented a cookie policy and banner on the site.

Security, Redundancy, Performance and Cost

Another part of this work was to ensure our data was secure. So in addition to the strong passwords we already have on our database and file storage, I introduced two-factor authentication wherever we store data, and in particular user data.

I put measures in place to mitigate data loss with redundancy. So I automated backups of our database. But in an intelligent way so that we still abide by data protection requirements. For example, if a user asks for their data to be removed, we need to remove it from everywhere within 30 days, including our data backups. So I architected backups so that this could be done easily and automatically (by essentially deleting any backups of personally identifiable user data that were over 30 days old).
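To give a flavour of how that pruning works, here's a minimal sketch (assuming the backups containing personal data sit in their own folder as date-stamped files; the paths are hypothetical, and in practice this would run on a schedule):

```python
# Hypothetical sketch: delete backups containing personal data once they're over
# 30 days old, so erasure requests are honoured in backups as well as the live database.
import os
from datetime import datetime, timedelta

BACKUP_DIR = "backups/user-data"   # hypothetical location for backups with personal data
MAX_AGE = timedelta(days=30)

def prune_old_backups(backup_dir: str = BACKUP_DIR) -> None:
    cutoff = datetime.utcnow() - MAX_AGE
    for name in os.listdir(backup_dir):
        path = os.path.join(backup_dir, name)
        # Treat the file's modification time as the backup's creation date.
        if datetime.utcfromtimestamp(os.path.getmtime(path)) < cutoff:
            os.remove(path)

if __name__ == "__main__":
    prune_old_backups()
```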

I then focused on performance and cost. Reads and writes in the database are often the bottlenecks to app performance. So I made sure that our database collections were properly indexed. We also had a few million job-pair records in our database. That’s because every time a user answers one of our “Would You Rather Be” questions, it stores their answer in our database. Each survey is 100 questions, so with 41,000 users (our current user count), that’s 4.1 million pairs. But we don’t really need to store all of these. So I decided to archive them to Dropbox storage (much cheaper, and removing them from the database would improve the performance of future reads and writes of pairs).
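For the indexing and archiving, here's a rough sketch of the kind of thing involved, assuming a MongoDB-style database accessed via pymongo (the connection string, collection and field names are illustrative rather than our actual schema):

```python
import json
from datetime import datetime, timedelta
from pymongo import MongoClient, ASCENDING

client = MongoClient("mongodb://localhost:27017")  # illustrative connection string
db = client["careers"]                             # illustrative database name

# Index the fields our hot queries filter on, e.g. fetching a user's answer pairs.
db.pairs.create_index([("userId", ASCENDING)])
db.users.create_index([("email", ASCENDING)])

# Export older answer-pair documents to a newline-delimited JSON file (which can
# then be pushed to Dropbox) and remove them from the live database to keep it lean.
cutoff = datetime.utcnow() - timedelta(days=90)    # illustrative retention window
old_pairs = {"answeredAt": {"$lt": cutoff}}        # illustrative field name

with open("pairs-archive.jsonl", "w") as f:
    for doc in db.pairs.find(old_pairs):
        doc["_id"] = str(doc["_id"])
        doc["answeredAt"] = doc["answeredAt"].isoformat()
        f.write(json.dumps(doc) + "\n")

db.pairs.delete_many(old_pairs)
```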

Researching 10 Target Careers

The last piece of work to prepare us for building our MVP was to deeply research how we can help people get into the careers they want to pursue. This involved some desk research, but it mainly involved talking to people doing those careers and hiring managers too.

It would take months to do this research for all 500 careers in our database. And really the key hypotheses we wanted to test initially were whether people want our help with this, whether we can actually help them, and whether it's commercially viable and scalable.

Plus users of our app have on average 16 careers in their final results. So if we just researched 10 careers, then 1 in 3 people would see at least one of these careers in their final results list.

Emma and I spent some time figuring out which 10 we should focus on. We decided to go quite broad across industries and role types to maximise the breadth of our learning. We also wanted roles that were popular with our users, not negatively impacted by Covid and relatively accessible without too much specialist training. Here’s the list we landed on:

  • Business Analyst
  • Care Worker
  • Graphic Designer
  • Management Consultant
  • Marketing Executive
  • Photographer
  • Police Officer
  • Teacher
  • TV/Film Director
  • Vet Nurse

Emma then set to work researching these. Mainly by setting up conversations with people in those careers or hiring for those roles. I then went on holiday for two weeks at the end of August while Emma continued with this research.

By the time I came back on August 28th Emma had almost completed the research, having interacted with over 30 people. So now we were all set to start building our MVP after the bank holiday weekend on September 1st! …

An Experiment Every 2 Days

Between September 1st and October 9th we ran 15 product experiments. And each one involved quite a lot of work!

Most of the experiments involved writing quite a bit of code and some required pulling together quite a lot of data. But for each experiment we also needed to decide what we were testing and design it. And we needed to track the right metrics, with a stats page to surface the results. Plus some of them required setting up a new partnership, signing a contract and integrating with an API or new data source. And then we needed to run it (which usually involved sending a few hundred or thousand users through it), digest the results, decide our next steps and write it all up. And we did that every 2 days! …

Choosing the Right Metrics to Measure

Most of our experiments were designed to measure whether users signed up to propositions we put in front of them or whether we could generate revenue from affiliate programmes with third-party providers.

For signups, the core metric we measured was the cost per sign-up. This was a combination of the cost per click for the ads we used and the conversion rate from clicks to sign-ups (cost per sign-up = cost per click ÷ click-to-sign-up conversion rate).

For our revenue-focused experiments, the core metric we used was the average revenue per final results user. In the industry we call this “unit economics”, which is the economics for an individual user, averaged across the whole user base. And a “final results user” is one that has gone through our entire survey and has reached the final screen with their career suggestions. So by measuring revenue per final results user, we can compare directly with our acquisition costs to see how close this gets us to being ROI-positive on ad spend.
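To make that concrete, here's a tiny illustration of how the two metrics come together (the 50p / 5% figures are made up purely for illustration; the second example reuses numbers you'll see in experiment 10 below):

```python
def cost_per_signup(cost_per_click: float, click_to_signup_rate: float) -> float:
    """Cost per signup = ad cost per click / conversion rate from clicks to signups."""
    return cost_per_click / click_to_signup_rate

def revenue_per_final_results_user(total_revenue: float, final_results_users: int) -> float:
    """Unit economics: average revenue per user who reached their final results screen."""
    return total_revenue / final_results_users

# Illustrative only: 50p per click at a 5% click-to-signup rate = £10 per signup.
print(cost_per_signup(0.50, 0.05))                 # 10.0 (£)
# £9.20 of affiliate revenue across 762 final results users ≈ 1.2p each.
print(revenue_per_final_results_user(9.20, 762))   # ≈ 0.012 (£)
```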

Experiment 1 — Career Change Programme

Our first experiment was to implement a cut-down version of the MVP Josh designed a couple of weeks earlier. We essentially wanted to show an offer for a free “career change programme” to help people into one of our 10 target careers. So if a user had one of our 10 target careers in their final career list after going through our survey, they’d see these careers highlighted with the offer.

On clicking the offer, they’d get a full page description of what was on offer. And if they went to “apply”, they’d have to complete a Typeform. Which is where we captured all the information we needed to determine what help they needed, how motivated they were and, essentially, whether we should set up a call with them. On completing the form, they would join a “waitlist”. And we could then choose who we wanted to have calls with.

It took us about three days in total to design this experiment, build it and then go live. Emma put together ten separate Typeforms (one for each of our 10 target careers) and pulled together all the copy and assets we needed. I built the pages in our web app to highlight the 10 target careers in a user’s final results and the single-page offer with a link to the appropriate Typeform. As well as all the tracking so we could measure what people did. For expediency, we decided not to implement the new visual design.

#1: Our Goals & Results

The main assumption we were testing was whether the offer would be compelling enough for people to complete the Typeform. Our specific goal was that 2.5% of people who see an offer would complete the Typeform.

And it completely failed. We didn’t get a single Typeform completion 🤦. In total, 97 people were shown an offer. But only 17 people clicked on it to read more (18%). And then 3 went into the Typeform. But no-one completed it.

The most surprising drop-off was that only 18% of people who saw the offer clicked through to read more. Our hypothesis behind that is that it was too big a jump to go from completing our survey to committing to a career change programme for a specific career.

Experiment 2 — Career Change Programme v2

So for experiment two, we spent a day improving the messaging and content to see if that improved any of our metrics. We were most interested in trying to improve the conversion of the first click (currently 18%) into the offer details. So to optimise for this, we changed the language on the final results for our target careers from “Apply Now” to “Learn More” to encourage people to click through to read more about those careers. And to further encourage people to click on these, we removed a lot of the extra information about our non-target careers we were showing in people’s final results to increase focus on our target careers.

Then on the detailed pages for our target careers, we added a lot of information from the research Emma had carried out. And below this detail, we included our offer. But we focused more on the benefits of the offer rather than the activities it might involve. And even though we kept it “free”, we attached a “normal price” of £50/month to try to increase its perceived value. And we added a deadline for applications.

#2: Our Goals & Results

The main assumption we were testing was whether our changes would drive up the click-through rate of the first click into the detailed pages for the target careers. Our goal here was an increase from 18% to 50%. We were also hoping to get a few Typeform completions too.

And it was another complete failure! Our click-through rate only increased to 23%, which fell far short of our 50% target. And we still didn’t get any Typeform completions. In total 162 people saw the offer. But only 37 clicked on it to learn more (23%). 6 people clicked “Apply Now”. But no-one completed the Typeform. So iterating on the content and messaging didn’t meaningfully improve our metrics. And 77% of people who saw at least one of our ten target careers didn’t click through to learn more.

We thought there might be two reasons for this:

  1. The average number of careers in people’s final results is 16. But we only show detailed information for one or two of these. Which are probably not people’s top careers to explore in many cases.
  2. People completing our survey have already invested 10–15 minutes to get to their final results. And they already know what their top suggested careers are. So they may not want to invest more time at that point to learn more.

So we decided to head off in two different directions for our next two experiments …

Experiment 3 — Career Change Programme for People with Intent

One direction we headed in was to test our career change programme with different people at a different stage in their career journey. Our current ads and product are particularly targeted at people who are exploring what career to pursue. But we really needed people a little further along, who were preparing for or even actively applying for one of our ten target careers.

So we used different Facebook ads and targeting. And we tried using Google AdWords, as that’s particularly effective at capturing intent when people search for one of our target careers on Google.

We also built landing pages that went directly into our Career Change Programme proposition from these ads (a different page for each of our 10 target careers). Rather than requiring people to go through our 100-question survey. And we added some information about Emma and me to make the service seem more human and credible.

Emma even built a custom Wix landing page (i.e. a separate website) that served as a landing page covering all of our 10 target careers. This also included a much simpler and easier signup form on the web page itself rather than a 15-minute Typeform to complete.

We even looked to see if we could advertise on job boards. But unfortunately they only let you advertise actual jobs.

So we tested all variants of these ads and landing pages to see if we could get any sign ups. This all took us about two days to set up.

#3: Our Goals & Results

The main assumption we were testing was whether we’d get Typeform completions by capturing the right people at the right time when they are actively seeking to get into one of our target careers. And our goal was 100 Typeform completions with £1,000 of ad spend (£10 per signup).

Our results this time were mixed. We actually got some signups 🎉 … 24 in fact! So targeting people this way worked much better.

But we spent £644 on ads. So it worked out at £27 per signup, nearly three times our £10 goal. And it was advertised as a free service, so we’d expect the cost per signup to be even higher if we charged. Which makes it challenging economically, especially as there would be an operational cost to deliver this programme.

70% of our signups came from three of our target careers — police officers, marketing executives and teachers. But of the ten police officers that signed up, only two of them lived in areas where the local force was currently recruiting. So we couldn’t do much to help the other eight.

Given these mixed results, and other learnings from later experiments, we decided not to invest much more in this direction. But we did take one extra step. Which was to try to set up initial coaching calls with about half of our users to measure the conversion to the initial call and whether we could actually help them (see experiment 8 below).

Experiment 4 — Discover Top Features for our Current Users

The other direction we took after experiments 1 and 2 was to try to get insights on what would be of most value to our current users who are going through our survey. Even though no-one engaged with our Career Change Programme proposition, it doesn’t mean they wouldn’t engage with a different proposition.

So to get these insights, we built a pre-survey before people entered our normal 100-question survey. In this pre-survey, we asked them all about their qualifications, employment status, current job and their happiness in it, what makes them unhappy, what stage of career change they’re at, their top challenges (specific to their career stage) and top features that stand out to them (again specific to their career stage).

We used the output of our design sprint and the 120 ideas we generated to populate the list of features we suggested. We came up with a list of about 50 features. And then we grouped them by one of our five career stages (exploring a new career, preparing for a new career, applying for a new career, applying for a job in the same career and staying in the same job). So each user only had to pick from a list of 10–15 features.

This took us a couple of days to build.

#4: Our Goals & Results

This experiment worked really well. We spent £215 on ads over a weekend and had 1,500 completions of our pre-survey. Which yielded a LOT of data. We then imported all of the data into a spreadsheet and analysed it.

Insights on Challenges & Features

1 in 2 of our users were at the first of our five career stages — exploring a new career. And the most common challenge these users had was that they don’t know what career they want to pursue (75% of this group). The most common feature they wanted was to map their personality, values, skills and qualifications to careers (>60% of this group).

The least popular career stage was preparing for a new career (only 10% of our users). This helps explain why we had such poor results in our first couple of experiments, as this was the stage we were targeting.

Those applying for a new job (in the same or a different career) added up to 22% of all users. And their most requested feature was personalised job alerts. A further 22% were happy in their current job.

Insights on Career Happiness

Another really interesting insight was that the average happiness of people who wanted to change careers was 2.7 out of 5. But those who wanted to stay in the same career had an average happiness of 3.7 out of 5. This suggests that if we can help people who want to change careers switch into a career they are happy to stay in, then their happiness would increase by an average of 40%. This is core to our mission of helping people find career happiness. A very helpful insight!

The final surprising insight was that the least common reasons for being unhappy in their careers were people’s bosses and colleagues. We assumed they would be the most common reasons. In fact, the most common reasons for unhappiness were lack of progression, salary and purpose.

Experiment 5 — Top Feature Propositions by Career Stage

These insights led into our next experiment, which was a big one! We essentially built proposition pages for the most popular features for each of the five career stages. We added some personalisation to them based on their survey results. And then made signing up really low friction (just a few fields on the same page). And we added a cost to each one, saying that we’d charge them once they are accepted onto the programme (they would join a waiting list, similar to our first experiments).

This all took us 2–3 days to figure out what each proposition should be and to build the pages. And then we switched our ads back on to see what happened.

#5: Our Goals & Results

We were testing a few assumptions with this experiment. Firstly, we wanted to see if our existing users would engage with some of our propositions. We wanted to learn which proposition at what career stage resonated the most. And we wanted to see if users were willing to pay for a proposition (in theory at least, as we weren’t actually charging them yet). Our specific goal was to get a 2.5% opt-in rate for our propositions (we aimed low as they had a cost attached to them).

This experiment was a success … just! We had a 2.3% opt-in rate overall. Of the 687 users who saw a proposition, 16 of them signed up. 13 of our 16 signups were for two of our five propositions: our Personalised Career Mapping Service (at the explore stage) and Personalised Job Alerts (at the apply stage). We spent £245 on ads, which resulted in a cost of £15 per proposition sign up.

But in reality, we’d only be able to execute on one proposition at a time. As each one would require a lot of effort. So the effective cost to get a signup for each of our two most popular propositions was £35–40, which is a little steep. Although with some optimisations, we might be able to reduce this.

The big milestone here was that we had our first commitment to revenue in the company’s history. So Emma and I celebrated! 🥳

Next Steps

So we’ve identified the two features that are most popular amongst our users: mapping them in a richer way to careers that suit them, and giving them personalised job alerts.

Before moving on from testing these propositions though, we wanted to do what we did in experiment 3: send users directly into our proposition pages rather than through the 100-question survey, to see if our acquisition costs would be lower …

Experiment 6 — Send Traffic Directly Into Our Proposition Pages

So Emma set to work on new Google AdWords and Facebook ads to send people directly into our five proposition pages. I updated the code so these pages would work as standalone pages, rather than requiring people to go through our survey first.

#6: Our Goals & Results

Our goal was to see if we had lower acquisition costs than experiment 5. We were aiming for about £20 per proposition. We had some success on AdWords for job alerts (we acquired sign ups for about £10 each). But we struggled to get delivery with AdWords on the other propositions. And Facebook ad sign ups came in at a similar price of £30–40 per sign up.

A New Strategy Forming — Mapping + Marketplace

One big insight we gained from using Google AdWords was that keywords linked to “jobs” were very competitive, especially amongst job boards. This meant that job boards were spending a lot of money to acquire users from Google. And we have users. So we figured we could send the job boards our users in return for cash.

This led us down a path towards a new strategy.

Mapping Attributes of a User to Careers

First of all, we decided that focusing on better matching people to suitable careers is the right focus for us next. We currently match people based on interests. But we can also match them based on their qualifications, skills, values, constraints, personality and aptitude. We decided this was the right focus for us for a few reasons:

  • It can be delivered with pure software, which plays to our strengths and enables us to scale with no marginal costs.
  • By focusing on the top of the funnel (deciding which career to pursue), we believe we can have a bigger social impact than helping someone enjoy a career that they’re not well suited to. Our data shows that we might be able to increase someone’s happiness by 40% if we help them into a career they enjoy.
  • Deciding what career to pursue is a problem that practically everyone in the world faces. Often multiple times in their careers. The fact that we have really low acquisition costs with our Facebook ads shows how keen people are for help with this problem. As does the fact that this was the most popular feature voted on by our users in experiment 4.
  • We’d capture people at the top of the funnel (right at the beginning of changing careers). Which means we could direct them to all the other services further down the funnel. Which are usually more commercial (the closer you get to an employment result, the more money there usually is). So owning the top of the funnel is very strategic from a commercial perspective as we could be sending users to very commercial players further down the funnel like job boards, training providers and career coaches.

A Personalised Marketplace for All Career-Related Services

And so the second part of our new strategy is to build a marketplace by sending people to downstream services after we help them discover the right career for them. We won’t solve all of a user’s career needs directly — only discovery. But as we’ll know so much about our users at that point, we can probably recommend the very best services that will help them.

For example, if they’re unemployed, have been homeless sometime in the last 12 months and live in one of ten London boroughs, we could direct them to Beam for support and funding for training to get back into employment. Or if they are a very smart young person from a disadvantaged background, we could direct them to Whitehat to get on an apprenticeship programme at Google or Facebook.

We could charge for many of these referrals. Which would enable us to build a sustainable business. And provide a free service to all users. Which would democratise career discovery and help everyone, no matter their background. This would maximise our impact and scale.

And Possibly Better Job Alerts

And the last feature that I’m keen to explore is personalised job alerts. This was our other most popular feature that people wanted. And delivering this well would enable us to have an ongoing relationship with the user, rather than losing them as soon as they’ve used our career discovery software.

As we could generate revenue each time someone clicks on a job ad in a job alerts email, we could significantly increase the revenue we make per user. If we build up a large base of users who are subscribed to email alerts, we could then potentially inject ads we source directly from employers. As we’d know so much about our users, we could do far more sophisticated matching of jobs to users than has ever been possible. For example, we could profile an employer’s best employees with our personality test (which we’ve not yet developed). And then match the results to users who are best suited. If we combined this with also matching values, aptitude, skills, qualifications and interests, employers could find far better candidates for the roles they need to fill than they’ve ever been able to.

So we’ll also explore this route in detail too.

Experiment 7 — The Next Step in our Personalised Job Alerts Proposition

For our next experiment, we wanted to take the next step towards helping the 20 people who signed up to our personalised job alert proposition (from experiment 5). So Emma put together a Typeform that we sent to all 20 users to better understand their requirements for job alerts.

#7: Our Goals & Results

Our goal was to get a strong response rate (>50%). And identify that we can feasibly solve the majority of people’s requirements or pain points. But sadly we didn’t get a good response rate. Only 1 person replied out of 20 emails we sent. And 3 emails even bounced back. We’ll chase a few more times. But it’s not a promising sign.

The learning here is that following up by email doesn’t seem to be an effective way to engage users. We’ll need to consider alternative ways to engage people earlier in future. But if we’re delivering an improved career discovery product that leads into a marketplace of personalised services, following up over email shouldn’t be as necessary.

Experiment 8 — Career Coaching Initial Calls

Our 8th experiment was to set up and run our first coaching calls, following up on our first three experiments with those who signed up to our Career Change Programme. Even though we decided not to pursue this further, we decided to have a few calls anyway to see what we could learn about the needs people have with changing careers and how we might be able to help. Emma had already done a lot of research around this. So we figured that it wouldn’t take much effort to test these out and learn from them.

So we sent Calendly invites to people who signed up for this programme so we could have a 30-minute chat with them to better understand their needs.

#8: Our Goals & Results

We were hoping to have a 60% conversion to coaching calls. But sadly we only had 3 calls set up after emailing 11 people (27% signup rate). And those 3 people didn’t even turn up for the scheduled calls!

So another big learning I think is that people don’t really want to talk. It’s a much bigger ask of people to have a coaching call. So if we can engage them with software and give them value this way, that’d be better. Plus this very poor conversion rate to actual calls would increase our acquisition costs further, making this approach even less viable.

Experiment 9 — Jobs Nearby Links on Final Results

Our first main experiment to test a part of our new mapping + marketplace strategy was to send users to job boards to see how many would click through.

We added an extra screen to ask users for their postcode and how far they’re able to commute. Then on a user’s final results, we added buttons on each of their careers saying “View Jobs Nearby”. On clicking these buttons, they’d see a full list of open jobs near them for that career. We were basically sending them to Adzuna with their location, commuting distance and job name passed through, which displayed all of these jobs.
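Conceptually, each button was just a link built from the user's answers and the career name, something like this (the query parameter names here are placeholders to show the idea, not necessarily the exact parameters Adzuna's job search page accepts):

```python
from urllib.parse import urlencode

def jobs_nearby_url(career: str, postcode: str, max_commute_miles: int) -> str:
    """Build the link behind a "View Jobs Nearby" button (illustrative parameters)."""
    params = {
        "q": career,                    # career name as the search term
        "where": postcode,              # the postcode the user gave us
        "distance": max_commute_miles,  # how far they said they could commute
    }
    return "https://www.adzuna.co.uk/jobs/search?" + urlencode(params)

print(jobs_nearby_url("Graphic Designer", "M1 1AE", 15))
```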

#9: Our Goals & Results — A Wild Success!

Our goal was to get 10% of users clicking through to at least one link. And a 15% clicks : users ratio (which assumes that on average people click 1.5 links).

We finally had wildly successful results! 42% of users clicked at least one link and we had a clicks : users ratio of 109%! That performed 7x better than we anticipated.

These results taught us that people really valued having links to live jobs in their area in one click. And people valued clicking on more than one job to view live jobs. Plus the economics are now potentially very attractive. We wouldn’t have to make too much per click to be unit profitable (i.e. to be ROI-positive on our ad spend).

Experiment 10 — Evaluate Adzuna’s Affiliate Program

So our next step was to try and make money from these clicks. Adzuna is also one of the 20 finalists of Nesta’s CareerTech prize, which meant it was easy to get an introduction to one of their cofounders. And Adzuna is probably the best jobs aggregator out there. So the ideal partner for what we were looking for.

Their cofounder put us in touch with their Head of Partnerships and we set up a call a couple of days later. We signed a contract for their affiliate program. And received tracking parameters to put on the URLs we sent users to.

#10: Our Goals & Results

Now it turns out that they pay on “second clicks”, not “first clicks”. So when a user clicks to go through to the Adzuna site and see a list of jobs, we won’t get paid. But if a user then clicks a second time to view one of those jobs in more detail, we get paid. That’s because Adzuna only gets paid on this second click and they share the revenue with us. We knew that second clicks would yield about 10p net revenue to us (for the UK market). So the big question was how many first clicks led to second clicks. If we had a 1:1 ratio, we’d make 10p per click to Adzuna. Which would make it very interesting economically due to our already low acquisition costs.

Sadly, it didn’t work out as I hoped. We had a 10:1 ratio of first clicks to second clicks, so 10 first clicks resulted in only 1 second click. We ran this for a few days and the effective revenue per click we sent to Adzuna ended up being 1.2p. With 762 final results users, we generated 104 valid second clicks at about 9p per click. That yielded £9.20 in revenue, or 1.2p per final results user (essentially the same as the revenue per first click, as we had a 1:1 ratio of final results users to first clicks).

Now this was a big milestone for the company — it was our first revenue! 🎉 … but a lot less than I was hoping.

Although 1.2p per user sounds like a tiny amount, it’s not quite as bad as it seems …

Cause for Optimism

Optimising Adzuna

When we analysed the jobs that we were sending people to, it turns out that only 30–40% of clicks actually resulted in any jobs appearing. That’s because we didn’t use the right job title in some cases. In other cases Adzuna doesn’t have the jobs on their platform. And in other cases, people live in areas where there aren’t any jobs in the categories they are searching for. So 60–70% of our first clicks couldn’t have resulted in second clicks because there was nothing to click on!

Most of Adzuna’s affiliate partners list jobs on their own platforms rather than linking to Adzuna to show the jobs (as we were doing). Adzuna have an API that enables this functionality so we can get a list of jobs for a certain career near a user. We could use this to display jobs directly on our final results page and we’d get paid for any clicks on those jobs. Adzuna also has a way for us to show the most profitable job ads, which we weren’t currently doing. Both of these optimisations should improve our revenues.

Additional Revenue Streams

We could also implement job alerts, which may significantly increase the lifetime-value of a user (the amount of revenue we’d make from a user over their lifetime). In fact, Adzuna said a job alert signup generates about 50p, which is equivalent to 5 clicks on job ads. And Adzuna has a product that would let us set this up and test it easily, where they deliver the job alerts rather than us. Long-term, we’d want to send the alerts directly as it’s more strategic. But in the short-term, it’ll be convenient for Adzuna to send them so we can test it with far less effort and time.

And then of course we could make revenue from all of our other marketplace partnerships. Like training providers and career coaches. And we could improve the relevance and overall product for users by improving career discovery. As well as better filtering and targeting the right ads to people (such as filtering for “entry-level” roles). These improvements should result in better engagement and more revenue per user. And we could even charge users for premium features. Like sending them a detailed report after they complete our various surveys.

A Pathway to Profitability

So 1.2p per user was a start. But there is a pathway where we could materially improve this. Even over the next few weeks and months. Once we can cover our acquisition costs, we’ll be profitable on a per-user basis. That’s because our product is pure software. So we have practically no operational costs to deliver our service.

If I’m honest, when I first saw the revenues I was pretty disappointed. After all these experiments and a wildly successful result in experiment 9 that exceeded our goals by 700%, we were only generating 1p per user! But I can see a pathway to this potentially working. So I’m optimistic.

We tested some of these optimisations and features in experiment 11 …

Experiment 11 — Direct Job Links on our Site + Job Alerts

For experiment 11 we used Adzuna’s API to list their jobs directly on our final results page. And to also give users the opportunity to sign up for Adzuna job alerts.

Initially we were only going to test job alert signups. But Adzuna only let users sign up for job alerts for one specific career, and only for jobs within 10 miles of where they live. And only if there are at least 20 open roles for this career in their area today. Now this won’t be the case for many of our users and careers. Probably only those with lots of open roles nationally. So we couldn’t show this option to everyone for all careers. We had to first check that there were at least 20 open roles for each of their careers near them.

And that required integrating with their API (Application Programming Interface, which is a way for computers to talk to each other). We essentially had to ask them how many jobs were near a user for each of their careers. But those same API requests also returned the details of the actual jobs as well as the count. So by putting all that effort in to get the job count, we were also getting all the information we needed to list the actual jobs. Which is why we decided to take that small extra step and show these jobs on the final results page too. And any clicks on these jobs would result in revenue too, as they count as job ads.
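Each (user, career) lookup boils down to a single search request along these lines (based on Adzuna's public jobs API; treat the endpoint and field names as indicative and the credentials as placeholders):

```python
import requests

ADZUNA_APP_ID = "your-app-id"    # placeholder credentials
ADZUNA_APP_KEY = "your-app-key"

def jobs_near(career: str, postcode: str, radius: int, limit: int = 5):
    """Return (total job count near the user, details of the first few jobs)."""
    resp = requests.get(
        "https://api.adzuna.com/v1/api/jobs/gb/search/1",
        params={
            "app_id": ADZUNA_APP_ID,
            "app_key": ADZUNA_APP_KEY,
            "what": career,
            "where": postcode,
            "distance": radius,            # search radius (check Adzuna's docs for units)
            "results_per_page": limit,
        },
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    # One request gives us both the count (to decide whether a job alert is viable,
    # i.e. at least 20 open roles) and the job details to list on the results page.
    return data["count"], data["results"]

count, jobs = jobs_near("Vet Nurse", "CB1 1PT", 10)
alert_eligible = count >= 20
```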

A Better User Experience

We only tried to list Adzuna jobs if there were at least 20 open roles across the UK. This equated to 59% of our careers (298 of our 504 careers).

For 50 of our remaining 206 careers we linked to a site or alternative job board that specialised in that career. For example, for police officers we linked them to this site that shows which forces are recruiting (as police officer jobs are not listed on Adzuna). And then for the remaining 156 careers we linked them to Google Jobs as they tended to have more comprehensive results than Adzuna (but don’t pay us for clicks).

For the 298 careers we tried to show local job listings for with Adzuna, some of them wouldn’t return any jobs that are local to some users. For example, in London about 77% of our careers had at least one job nearby on Adzuna. But in Colwyn Bay (a more remote town in North Wales where I grew up!), only 14% of our careers have any jobs nearby. So if we tried to pull local jobs but didn’t get any, we’d instead show a link to Adzuna to list all of the jobs in the UK (rather than just their local area). And this would be an affiliate link which would give us some potential extra revenue.

And finally, some of our career names weren’t optimal search terms for Adzuna. For example “Airline Pilot” was better rephrased as “Pilot”. We rephrased about 100 of our careers when searching against Adzuna’s API. This improved our coverage by about 8,500 extra jobs across the UK.
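Pulling the last couple of paragraphs together, the per-career logic ends up looking roughly like this (the override entries, field names and affiliate parameters are illustrative):

```python
from urllib.parse import urlencode

# A small sample of the ~100 careers we rephrase into better Adzuna search terms.
SEARCH_TERM_OVERRIDES = {
    "Airline Pilot": "Pilot",
    # ...
}

def adzuna_search_term(career: str) -> str:
    return SEARCH_TERM_OVERRIDES.get(career, career)

def career_jobs_block(career: str, local_jobs: list, affiliate_params: dict) -> dict:
    """Decide what to show against one career on the final results page."""
    if local_jobs:
        # We found jobs near the user: list them directly on our page.
        return {"type": "inline_jobs", "jobs": local_jobs}
    # No local jobs: fall back to an affiliate link listing that career UK-wide.
    query = {"q": adzuna_search_term(career), **affiliate_params}
    return {"type": "link",
            "url": "https://www.adzuna.co.uk/jobs/search?" + urlencode(query)}
```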

All of these changes significantly improved the user experience from experiments 9 and 10. Each career would have far better and more relevant links to jobs than they did previously.

#11: Our Goals & Results

We were testing two assumptions with this experiment.

The first was that showing jobs directly on our platform would increase our unit revenues (up from 1.2p per final results user that we got from experiment 10). We were hoping for at least 2p per final results user (67% increase).

The second was that job alerts would drive additional incremental revenue. We were hoping for an additional 2.5p per final results user, which would be about 5% of our users signing up at 50p per signup.

Direct Job Links Performance

Our direct job links performed ok. They came in at 1.8p per final results user (50% improvement). 247 people got to our final results page, and 140 of them saw job ads for at least one of their careers (57%). Each of these users saw on average 8 job ads in total out of a potential pool of 44 job ads on average (we only showed the top 5 job ads for any given career). 30 of these 140 users clicked on at least one job ad (21%) and we had a 31% clicks : user ratio (on average people clicked on 1.5 job ads).

1 in 5 users clicking on jobs isn’t too bad. And there are lots of ways we could improve this further. I’ll go through these in our next experiment (12) where we tested most of these improvements.

Job Alerts Performance

Our job alerts test completely failed! We had 1 subscriber, which gave us 0.2p per final results user. The biggest problem was that only 40 of our 247 final results users saw a job alert (16%). And those that saw one only saw job alerts for on average 1.5 of their careers (where the average user has 16 careers in their final results screen).

The problem here was that there weren’t many careers with at least 20 open roles near to our users. So delivering Adzuna job alerts to our users isn’t very feasible. If we built our own job alerts platform and grouped jobs for all of a user’s careers of interest into one email, it might work better. Plus we could include remote jobs, especially for those who live in the sticks.

But we’ll probably only come back to try job alerts again if and when we meaningfully improve the performance of our direct job links. As our ability to deliver direct job links well will determine our ability to deliver relevant job alerts to people.

Experiment 12 — Direct Job Links on our Site v2

For experiment 12, we improved the direct job links we implemented in experiment 11 to see if we could improve our commercial metrics (currently at 1.8p per final results user). We made a series of small changes.

In experiment 11 we only showed 5 jobs for each career, even if there were more available. So we added a “Show More Jobs” button that pulled in another 10 jobs. Users could keep tapping this until 50 jobs displayed, which was the limit on Adzuna’s API.

We also showed salary information where it was available. Sometimes salaries were displayed in annual amounts (e.g. £10,000+) and other times they were in hourly amounts (e.g. £9). So I built some logic to determine this and display it accordingly. Also, some salaries had a minimum and maximum, some only had one value and others were predicted rather than actual. So I parsed all of this information to show it in an easy-to-read format for the user.

And finally we ordered the results in ascending salary order, factoring in whether the salaries were hourly or annual (by multiplying hourly salaries to the annual amount for comparison). This then highlighted the lowest paid jobs first, which are more likely to be entry-level jobs for that career, and so more relevant and accessible to our users who are considering changing careers.
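Here's a simplified sketch of that salary handling (the under-£100 hourly heuristic and the working hours used to annualise are assumptions for illustration, not the exact rules we use):

```python
from typing import Optional

HOURS_PER_YEAR = 37.5 * 52   # assumed full-time hours, used to annualise hourly rates

def looks_hourly(amount: float) -> bool:
    # Heuristic: small figures (e.g. £9) are hourly rates, large ones (e.g. £10,000+) are annual.
    return amount < 100

def annualised(salary_min: Optional[float], salary_max: Optional[float]) -> Optional[float]:
    """Collapse whatever salary data a job has into one annual figure for sorting."""
    values = [v for v in (salary_min, salary_max) if v]
    if not values:
        return None
    midpoint = sum(values) / len(values)
    return midpoint * HOURS_PER_YEAR if looks_hourly(midpoint) else midpoint

def sort_jobs_entry_level_first(jobs: list) -> list:
    # Ascending salary order, so the lowest paid (likely entry-level) roles come first.
    # Jobs with no salary information go to the end.
    return sorted(jobs, key=lambda j: annualised(j.get("salary_min"), j.get("salary_max"))
                  or float("inf"))
```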

#12: Our Goals & Results

Our goal was to improve on our unit economics from experiment 11 of 1.8p per final results user. So we set a goal of 2p.

Sadly and surprisingly this one didn’t work! We ended up with only 0.9p per final results user, half the performance of experiment 11.

We did increase the number of jobs a user saw on average from 8 to 22. But the % of people clicking on Adzuna job ads reduced from 21% to 12% (about half). And the average no. of clicks per clicker reduced from 1.5 to 1.2.

Our best guess as to why is that the main intent for our users is to see what jobs there are near them for each career of interest. But they don’t intend to go further at that moment. So by displaying salary and descriptive information, and giving them access to see more jobs, there is less reason to click through, as that satisfies what many of our users are looking for. Which is better user value, but worse in terms of commercialisation. Although technically Adzuna should be getting better value per click, as each user we send them should have higher intent on average to apply for a job.

We also tested this for an extra day with 18–24 year-olds to see how it would affect the performance (so far all of our tests have been with 25–44 year-olds). Its performance was worse with this younger demographic, coming in at 0.5p per final results user. With only 8% of users clicking vs. 12%. So about 40% worse performance. This suggests that job listings are more relevant for an older demographic.

Next Steps

The next obvious step to take here is to understand better why people weren’t clicking as we originally expected. So getting qualitative insights by user testing with people makes sense as a next step here. We essentially need to sit down with 5–10 people to observe them going through our app for the first time to see how they respond when they see these jobs listed. We may not do this as an immediate next step due to our other priorities. But we’ll do this when we’re ready to pursue job listings further.

And there are some big optimisations we can still do. For example, we’re still only covering 33% of all jobs on Adzuna (209,000 out of a total of 625,000 open roles in the UK today). At some point we’ll want to look at what those other jobs are and how we can cover them. We also have the option of prioritising the highest-performing job ads. This could double what we earn per click, effectively doubling our revenue. And we could show remote jobs. This is especially useful for people who live in more remote areas with fewer jobs near them. Plus, there are many more remote jobs than previously existed due to a new work-from-home culture adopted by many companies in response to the pandemic.

Actual Adzuna Revenue

The results we recorded approximated revenue based on an assumed 10p of revenue per click. We just measured the number of clicks we sent Adzuna and estimated the revenue we’d make. When we checked our actual revenue against Adzuna’s dashboard, it turned out to be a little different.

For our direct job listings over experiments 11 and 12, we actually generated £6.76 in revenue from 100 clicks. Our data showed we generated only 76 clicks, which would have been £7.60 in revenue at 10p per click. So we actually sent more clicks than we thought, but earned less per click (6.6p vs. 10p). Which meant our revenue was 11% lower than we assumed, coming in at 1.1p per final results user rather than 1.3p.

But the nice surprise we had was that we generated a further £3.26 from other users we sent to Adzuna to view jobs on their platform. As I mentioned in experiment 11, if Adzuna didn’t return any jobs that were local to a user, we’d send them to the Adzuna website to view all jobs in the UK. And this would be an affiliate link that would generate revenue for us on “second clicks” if they were to then click on any of these jobs. It turns out that people did click on these. In fact, we generated 28 “second clicks” from 597 final results users, yielding an additional 0.5p per final results user.

The Economics of Adzuna Affiliate Links

So in total, blended across experiments 11 and 12, we actually generated 1.6p per final results user. If we just took the results of experiment 11 for direct job links of 1.8p per final results user, and applied the 11% down-multiplier to approximate actual revenue, we’re getting 1.6p. And then with the additional 0.5p from the other Adzuna affiliate links, we’re actually at 2.1p per final results user. Which is progress! And maybe we could increase this to 4–5p or more with some of the big optimisations I mentioned earlier.

Experiment 13 — Link to Udemy Courses

The next experiment we tested was a second type of “marketplace” offering — links to training courses that could help someone get into their preferred career. For this experiment, we showed a second button next to the “Jobs Nearby” buttons that said “View Courses” (we actually ran this experiment immediately after experiment 9, but it makes more sense to group all the job experiments together for this blog post). This then linked them to Udemy along with the career name. Which showed a list of relevant training courses for that career.

The cost to the user was £20 and upwards per course. Udemy also has an affiliate program, where they share 15% of any revenues generated from users we send them. So we thought we’d first test the click-through rates with this experiment before setting up on the affiliate program. Then we can test the volume of clicks before we determine what revenue they generate.

#13: Our Goals & Results

Our goal was to get a material number of users clicking on these buttons. Specifically we were hoping for 50% of the number of clicks our “Jobs Nearby” buttons were generating. It took us only a couple of hours to set this experiment up and the next day we had our results. We had 33% of the number of clicks of the “Jobs Nearby” buttons. So a little lower than our target, but still material. 1 in 5 users clicked on at least one button (with the average clicking 1.5 different buttons). And importantly, having these buttons didn’t seem to cannibalise the clicks of our “Jobs Nearby” buttons.

We also tested this with 18–24 year-olds (rather than 25–44 year-olds as we’d done so far). The results improved slightly, yielding about 10% more clicks. This suggests younger people are less interested in seeing jobs and more interested in exploring training opportunities.

This was a strong enough result to proceed and get set up on Udemy’s affiliate program to see what revenues these generated on a per-user basis. We did this in experiment 14.

Experiment 14 — Evaluate Udemy’s Affiliate Program

We signed up for Udemy’s affiliate program, which took a few days before we were approved. But it required us to manually generate an affiliate link for every unique link we wanted to Udemy’s courses. And each of our 500 careers needed a unique link. So we decided to do this for 150 careers for this test to save us time. Emma created these links, and spent some time making sure we used optimal search terms for relevant courses for each career. For example “Stunt Performer” yielded courses on theatre performing, so Emma instead used “acrobatics” as a search term as the results were a little more relevant.

I then ingested these links into our database and used them for the buttons we’d already created in experiment 13. We then enabled ads over a couple of days and waited on our results from Udemy to see if it generated any revenue (i.e. if anyone had purchased a course so we could share in 15% of the revenue).

#14: Our Goals & Results

We generated 160 clicks in total, but sadly no revenue. So it wasn’t compelling enough for our users to purchase a course. 15% of users clicked on a “View Courses” button (vs 21% in experiment 13). And clickers clicked on 1.3 “View Courses” buttons on average (vs 1.5 in experiment 13). These metrics were slightly lower because we only had ⅓ of the coverage we had in experiment 13, as we only included buttons with links for 150 of our 500 careers. So the click volumes were fine, but it was a failed experiment as it generated no revenue.

Similar to experiment 12, our next obvious step here is to user test with 5–10 people to see what they say and do when they click through to view courses on Udemy. That might give us some indication as to why people are not purchasing the courses.

Experiment 15 — Test Learning Curve Group’s Courses

And finally, our last experiment. And it was a chunky one!

Liam Goddard, an old friend of mine and one of our angel investors, introduced me to the Learning Curve Group. They are a training company who train 50,000 people a year. They have over a hundred courses that are delivered digitally. 50 of these courses are fully funded (by the Education & Skills Funding Agency), which means they’re free for users. And they each give a user a level 2 qualification, which is equivalent to a GCSE. They’re quite short too — lasting between 3 and 9 weeks.

The Learning Curve Group gets paid each time a user completes one of their courses. And they have a referral model whereby they pay partners £30 per completed course. So we wanted to set up an experiment to see if our users would be interested in these courses and to see if we could generate revenue this way.

Building this Integration

There are eligibility criteria for users though. They have to be at least 19 years old and live in England. But not in certain locations, like Liverpool. And they can’t be on an apprenticeship programme. So we used Facebook ads to ensure people were at least 19 years old. We built up a list of the 3,000 postcode prefixes in the UK (called “outcodes”) and mapped those that were eligible. And we asked users as part of the app’s pre-survey if they were currently on an apprenticeship programme.
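Put together, the eligibility check is roughly this (the outcode set and field names are illustrative; the real check uses the full mapped list of ~3,000 outcodes):

```python
# Illustrative subset: in reality this is the full mapping of eligible England outcodes,
# minus excluded areas like Liverpool.
ELIGIBLE_OUTCODES = {"CB1", "M1", "LS2", "BN1"}

def outcode(postcode: str) -> str:
    """'CB1 1PT' -> 'CB1' (the part before the space)."""
    return postcode.strip().upper().split()[0]

def eligible_for_funded_course(age: int, postcode: str, on_apprenticeship: bool) -> bool:
    return (
        age >= 19                                    # enforced via ad targeting, checked again here
        and outcode(postcode) in ELIGIBLE_OUTCODES   # lives in an eligible part of England
        and not on_apprenticeship                    # asked in our pre-survey
    )

print(eligible_for_funded_course(24, "CB1 1PT", on_apprenticeship=False))  # True
print(eligible_for_funded_course(18, "L1 8JQ", on_apprenticeship=False))   # False
```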

Emma then produced a mapping of their 50 courses to our 500 careers. I wasn’t sure how much coverage 50 courses would give us across our 500 careers, so I was concerned a user wouldn’t see that many courses in their final results. But it turns out the mapping was really strong as their courses cover a wide range of soft skills applicable to many of our careers. The Learning Curve Group also complemented our efforts here with their own mapping. I then ingested this into our database. And then displayed courses next to each of a user’s careers based on this mapping, and only if the user is eligible. And each button contained a tracking link so that we’d know if they signed up to a course and we’d get paid if they completed one.

#15: Our Goals & Results

Our goal was to make some meaningful revenue from this integration. They pay us £30 per completed course. They told us that 94% of users who start a course complete it (probably partly because a user needs to pay them £125 if they don’t complete it!). And 15–20 leads to their website result in a signup. If these metrics held true for our test, then each click to their website would be worth about £1.40 to us (£30 × 94% ÷ 20 ≈ £1.40).

But I assumed our performance wouldn’t be this strong. Mostly because their onboarding flow and course delivery wasn’t optimised well for mobile. For example, it takes about 20 minutes to sign up and requires a user to upload a copy of their passport and most recent payslip. But our users will be doing this on mobile (as their entry-point is a Facebook ad) and I’m not sure they’ll be able to easily upload copies of these. I also tested the course delivery on mobile and it doesn’t seem to work — it only seems to work on desktop. So our users would probably need access to a computer to complete the course, which may cause a further drop-off.

We spent £620 on ads over the weekend and had 1,357 final results users.

Click Metrics

And our click-metrics were stronger than I expected. Which was mostly because our coverage of courses to careers was so strong. I assumed that only 20% of users would see at least one course, but it was actually 69% (942 of our 1,357 final results users). And the average user saw 59 courses! So obviously many courses were shown on multiple careers for users in their final results. I then assumed that 5% of users who saw courses would click. This was slightly better at 6.6% (62 unique users clicked on 83 courses in total, so 1.3 each on average).

And the most popular course was Understanding Climate Change and Environmental Awareness. 15% of users who clicked chose this course. And it had double the number of clicks per impression compared to the next most popular course.

Conversion / Revenue Metrics

So if we had the same metrics as the average users on the Learning Curve Group’s site, then with 62 unique users clicking, we’d get about 4 completed courses. Which would be £120 in total revenue. Spread out across our 1,357 final results users, that would be 8.8p each. Which would be a pretty good result!

We literally just came off a call with them and sadly we didn’t get any sign ups. But it’s not all bad news. For some reason they saw twice as many users as us — we logged sending them 62 users, but they logged seeing 129 users. And 45 of those users clicked the “Enrol Here” button, so they had intent. But we lost all of them during the 10-step sign-up process somewhere. And I’m not too surprised given the challenges of signing up on mobile that I mentioned earlier.

We’re not sure at which step we lost those users though. So the team at Learning Curve Group are going to look into that and get back to us. Then we can see if we can optimise those parts in any way to see if we can get some conversions. Even if just 10% of people completed the sign up process, we’d be making about 10p per final results user. Which would be transformational for us!

Other Opportunities

And they’re keen to explore other partnership opportunities with us, which are really exciting. For example, as part of the government’s Kickstart programme they need to offer some career advice and guidance to their students. So we can see if we can help them with that. Plus we might be able to better guide their students to the right courses with our skills mapping product that we’ll be developing in the coming weeks.

And finally, we’ll explore other training providers who have a better mobile-optimised sign up process to see if we can run a test with them.

My Biggest Personal Challenges During this Phase

I found that the most challenging aspects of running these 15 experiments were dealing with the failures psychologically and coping with the intensity of such rapid learning, creativity and execution (in particular the coding).

Most of our experiments failed to meet our expectations and goals. And some yielded no positive results at all! While we did have some successes, it still feels like we’ve got a long road ahead of us before we can find a viable commercial model. But we learnt A LOT and it’s really helped with our strategy and direction.

And to sustain this pace and avoid burnout, I never worked more than 40 hours a week. I also went for 1–2 hours of walking every day to clear my head. And I went to the gym every morning for a 20-minute cycle before I started my day.

Our New Strategic Direction

The biggest positive outcome from these tests is the new strategic direction we now have. I talked a bit about this earlier at the end of experiment 6. But the learnings we then got from the rest of our experiments further clarified this direction for us.

Essentially, we want to help people find career happiness by matchmaking them to careers, services and jobs.

We want to build the best career discovery product on the planet. Our goal is to use software and data to help everyone discover a career that is really well suited to them. By taking into consideration their interests, qualifications, skills, values, personality, aptitude and constraints. We only do interests today. We’ll aim to do qualifications, skills, values and constraints by the end of the year! We think we can make this 10x better than anything else out there. And as it’ll be pure software, it will scale really well.
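To give a flavour of what mapping all of these attributes might look like, here's a very rough sketch of a possible profile model and a toy matching score. Every name and the scoring logic below are illustrative assumptions, not a description of how our product actually works:

    # Illustrative sketch only: a possible user profile model and a toy matching score.
    from dataclasses import dataclass, field

    @dataclass
    class UserProfile:
        interests: list[str] = field(default_factory=list)       # what we match on today
        qualifications: list[str] = field(default_factory=list)  # planned by end of year
        skills: list[str] = field(default_factory=list)          # planned by end of year
        values: list[str] = field(default_factory=list)          # planned by end of year
        constraints: list[str] = field(default_factory=list)     # e.g. location, salary needs
        personality: dict[str, float] = field(default_factory=dict)  # further out
        aptitude: dict[str, float] = field(default_factory=dict)     # further out

    def match_score(profile: UserProfile, career_tags: set[str]) -> float:
        """Toy score: the fraction of a career's tags covered by the user's interests and skills."""
        user_tags = set(profile.interests) | set(profile.skills)
        return len(user_tags & career_tags) / len(career_tags) if career_tags else 0.0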

We’ll then know a lot about our users, so we should be able to connect them to just the right services to help them progress in their career. These could be career coaches, training providers or services to help people get a job like CV writing and interview preparation. There are even specialist services that help specific groups of people, like Beam who crowdfund the cost of training for homeless people to help them get back into work.

And finally, given the depth of knowledge we’ll have on our users, we hope we can match them to the jobs that suit them best.

Thoughts on Commercialisation

Our “marketplace” is one way we can make money, which is matchmaking people to services and jobs. That’s what we’ve been testing with half of our experiments over the past 6 weeks. And we’ve generated some revenue that way already. As we scale the partnerships, coverage and relevance, we should be able to meaningfully improve this revenue.

There are other ways we want to test making money too though. As we build out our career discovery product and create more value for users, we'll have more options for testing ways of charging people. For example, once we map all of the attributes of a user, we could show them their results for free but charge them for a 60-page PDF going deep into who they are, the careers they map to, insights on the labour market, their biggest skill gaps and how to fill them, and so on. Maybe people would pay £5 or £10 for this report?

And if that worked, we could then sell into schools and universities who could offer this to all of their students. We could offer it at a discounted rate, which would give us annual recurring revenues from a new class of students each year.

And the last model I'm keen to test is licensing our technology to third parties. We've already had a few businesses express interest in licensing our career discovery software. This could be valuable to job boards, to help people discover which jobs they might be interested in, or to training providers, to help people identify the best courses to meet their career goals.

We’ll be testing all of these models in the new year once we’ve further built out our career discovery product. And if they all work, maybe we’ll have a hybrid business model with multiple revenue streams.

Commercialisation vs. Social Impact

I’m doing this startup to make a difference. My personal career goal is to make a material difference to the wellbeing of millions of people. This startup intersects with my personal mission by trying to help millions of people find career happiness.

I believe that the further up the funnel you go (helping people discover the right career for them rather than optimising the wrong career), the more impact we’ll have. But career discovery is harder to commercialise than optimising an existing career. Because most of the commercial value sits with employers (they pay a lot for talent) and an existing professional with skills is more valuable to an employer. And career changers typically have less disposable income than experienced professionals, making it harder to commercialise with a direct-to-consumer model.

I also believe we can have more impact by helping someone earlier in their life. Not only have they got more years to enjoy a happier career, but it’s much easier to make career changes the younger you are. And the most impactful stage is probably at school. But younger people are also less valuable to an employer, compared to an older, more experienced person. Plus younger people have less money.

So by focusing on social impact, it does feel like a harder road. But it doesn’t mean it’s impossible. And it also means that the problems we’re solving are probably more neglected. So I say onwards!

What’s Next?

Running these product experiments over the past few weeks has been demanding. So I’m relieved to be moving into a different phase. We’ll be focusing on building out user value for the rest of this year. We’ll be doing this in two ways. Firstly, we’ll improve career discovery by mapping people’s qualifications, skills, values and constraints to careers. And secondly we’ll be building out our marketplace to connect people to more targeted services that will help them with their career.

I see this next phase as increasing the number of "lego blocks" we have, where each block is a feature. We've built a few so far. We've built a pretty good career discovery product based on interests. We're also capturing quite a bit of information about our users. And we're showing them somewhat relevant training options and jobs near them.

Over the next few weeks we want to build out more of these blocks to help people in more ways. We'll then have twice as many lego blocks to play with in the new year, and we can test them in different permutations to see how we can best help people. We'll do this in tandem with qualitative user testing. We can test different permutations commercially too. And finally we want to work on our marketing, both channels and propositions. Having more features will give us more options in the tests we can run in all of these areas.

I always remember a saying I heard at Facebook from Boz, a VP there. He said we need to build the river before we build the water wheel. In other words, we need to create the value for users before we can really commercialise it. So that's what we'll be focusing on next.

A New Nesta Prize

We’re currently part of Nesta’s CareerTech prize. They gave us a £50k grant, along with 19 other finalists. And the winner and runner-up will get a further £120k and £80k respectively in the new year. So we’re keeping one eye on that to ensure we put a good application forward.

But Nesta have also recently announced a new prize in response to the challenges the pandemic is causing in people’s lives. It’s called the Rapid Recovery Challenge. And there’s a lot more money up for grabs.

They have two streams: jobs and money. We'll be focusing on jobs, of course. The challenge is to build tools and services that improve people's access to jobs, particularly for those hardest hit by the economic shock resulting from Covid-19. The deadline for the application is October 26th, so we're working hard on it. There will be three for-profit semi-finalists on the jobs stream, each getting £125k, with up to another £350k the following year for the winner. This prize is very relevant to us and would enable us to move quite a bit faster.

Onwards!

If you’ve read this far, thank you! This is my longest blog post so far, but I wanted to capture our journey the past 10 weeks as I’ve personally learnt a lot and I hope it has some value for you too. I personally really value the responses I get from friends who read my blog posts with offers of help or insights. So if you have any, please do get in touch with me.

My hope is that by the time I write my next blog post, our product has improved dramatically. So stay tuned and thanks for following along!
