15 Product Experiments in 6 Weeks

August Preparation

The first two weeks of August were pretty busy. Our goal was to complete a number of necessary projects before my two-week-long break at the end of August. We'd then be in a position to start executing rapidly on our product experiments in September.

Brand & Design

We worked with the Forward Partners Studio team on our brand and visual design over 5 days. In preparation, I had some homework to do. I pulled together the latest articulation of our vision and mission, and described four personas (groups of people) that represented our target users. I also gathered interesting competitors of ours, as well as brands that I personally like. And we decided on six brand characteristics that best describe Would You Rather Be. These are:

  • Trusted
  • Approachable
  • Adventurous
  • Optimistic
  • Imaginative
  • Fun

Idents, Colours & Typography

Our designer, Josh, produced lots of ideas and options for idents, colours and typography by day 2. These were all inspired by our six brand characteristics, and he put together a wide range of idents for us to review.

Top 3 Designs

Josh then incorporated our preferred options into three distinctly different designs by day 3. He applied each design to the main screens of our current web app and new prototype to give us a flavour of what each one would look like.

Our Design System

And the last step in the process was our full design system that Josh put together. This included our logo, colours, spacing and typography.

MVP Design

We then spent the second week of August working with the Studio team to fully design our MVP that we’d work towards building. This was the output from our 10-week-long process of product discovery. We also applied our new visual design to the MVP, which takes users through seven steps:

  1. Complete our current career survey
  2. View the careers pages
  3. Shortlist careers
  4. Choose one career
  5. Set up a call with a career coach
  6. Have the call
  7. Follow up on actions after the call

Data Work

I also spent some time at the start of August on our data, to make sure it can scale and so I won't have to worry about it later.

GDPR Compliance

Part of this work was to ensure we’re GDPR compliant. So I updated our privacy policy and implemented everything the new policy requires. This included the principle of “data minimisation”, which means that we should only store the data we need. I had been storing the email addresses of users who hadn’t opted into future emails. So I deleted all of these and ensured we didn’t store them going forward. I also implemented a cookie policy and banner on the site.
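For a flavour of what that clean-up involves, here's a minimal sketch, assuming a SQLite database with a hypothetical users table and opted_in_to_emails flag (our real schema may differ):

```python
import sqlite3

# Minimal data-minimisation sketch: remove email addresses for users who
# never opted into future emails. Table and column names are hypothetical.
conn = sqlite3.connect("app.db")
conn.execute("UPDATE users SET email = NULL WHERE opted_in_to_emails = 0")
conn.commit()
conn.close()
```

The other half of the job is making sure new signups only have their email stored once they've actually opted in.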

Security, Redundancy, Performance and Cost

Another part of this work was to ensure our data was secure. So in addition to the strong passwords we already have on our database and file storage, I introduced two-factor authentication wherever we store data, and in particular user data.

Researching 10 Target Careers

The last piece of work to prepare us for building our MVP was to deeply research how we can help people get into the careers they want to pursue. This involved some desk research, but it mainly involved talking to people doing those careers, and hiring managers too. These are the 10 careers we targeted:

  • Business Analyst
  • Care Worker
  • Graphic Designer
  • Management Consultant
  • Marketing Executive
  • Photographer
  • Police Officer
  • Teacher
  • TV/Film Director
  • Vet Nurse

An Experiment Every 2 Days

Between September 1st and October 9th we ran 15 product experiments. And each one involved quite a lot of work!

Choosing the Right Metrics to Measure

Most of our experiments were designed to measure whether users signed up to propositions we put in front of them or whether we could generate revenue from affiliate programmes with third-party providers.

Experiment 1 — Career Change Programme

Our first experiment was to implement a cut-down version of the MVP Josh designed a couple of weeks earlier. We essentially wanted to show an offer for a free “career change programme” to help people into one of our 10 target careers. So if a user had one of our 10 target careers in their final career list after going through our survey, they’d see these careers highlighted with the offer.

#1: Our Goals & Results

The main assumption we were testing was whether the offer would be compelling enough for people to complete the Typeform. Our specific goal was that 2.5% of people who see an offer would complete the Typeform.

Experiment 2 — Career Change Programme v2

So for experiment two, we spent a day improving the messaging and content to see if that improved any of our metrics. We were most interested in trying to improve the conversion of the first click (currently 18%) into the offer details. To optimise for this, we changed the language on the final results for our target careers from “Apply Now” to “Learn More”, to encourage people to click through and read more about those careers. And to further encourage those clicks, we removed much of the extra information we were showing about non-target careers in people’s final results, to increase focus on our target careers.

#2: Our Goals & Results

The main assumption we were testing was whether our changes would drive up the click-through rate of the first click into the detailed pages for the target careers. Our goal here was an increase from 18% to 50%. We were also hoping to get a few Typeform completions too. Digging into why engagement was so low, two things stood out:

  1. The average number of careers in people’s final results is 16. But we only show detailed information for one or two of these, which in many cases are probably not the careers people most want to explore.
  2. People completing our survey have already invested 10–15 minutes to get to their final results. And they already know what their top suggested careers are. So they may not want to invest more time at that point to learn more.

Experiment 3 — Career Change Programme for People with Intent

One direction we headed in was to test our career change programme with different people at a different stage in their career journey. Our current ads and product are particularly targeted at people who are exploring what career to pursue. But we really need people a little further along, who are preparing or even actively applying for one of our ten target careers.

#3: Our Goals & Results

The main assumption we were testing was whether we’d get Typeform completions by capturing the right people at the right time when they are actively seeking to get into one of our target careers. And our goal was 100 Typeform completions with £1,000 of ad spend (£10 per signup).

Experiment 4 — Discover Top Features for our Current Users

The other direction we took after experiments 1 and 2 was to try to get insights on what would be of most value to our current users who are going through our survey. Even though no-one engaged with our Career Change Programme proposition, it doesn’t mean they wouldn’t engage with a different proposition.

#4: Our Goals & Results

This experiment worked really well. We spent £215 on ads over a weekend and had 1,500 completions of our pre-survey. Which yielded a LOT of data. We then imported all of the data into a spreadsheet and analysed it.
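We did the analysis in a spreadsheet, but the same cuts are easy to reproduce in code. Here's a minimal sketch in pandas, assuming a hypothetical CSV export with career_stage, biggest_challenge and top_feature columns (the real export's column names may differ):

```python
import pandas as pd

# Load the pre-survey responses (column names here are hypothetical).
df = pd.read_csv("pre_survey_responses.csv")

# How are respondents spread across the five career stages?
print(df["career_stage"].value_counts(normalize=True))

# For each stage, the most common challenges and most-wanted features.
for stage, group in df.groupby("career_stage"):
    print(f"\n{stage} ({len(group)} respondents)")
    print(group["biggest_challenge"].value_counts(normalize=True).head(3))
    print(group["top_feature"].value_counts(normalize=True).head(3))
```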

Insights on Challenges & Features

1 in 2 of our users were at the first of our five career stages — exploring a new career. And the most common challenge these users had was that they don’t know what career they want to pursue (75% of this group). The most common feature they wanted was to map their personality, values, skills and qualifications to careers (>60% of this group).

Insights on Career Happiness

Another really interesting insight was that the average happiness of people who wanted to change careers was 2.7 out of 5. But those who wanted to stay in the same career had an average happiness of 3.7 out of 5. This suggests that if we can help people who want to change careers switch into a career they are happy to stay in, their happiness would increase by roughly 40% on average (from 2.7 to 3.7). This is core to our mission of helping people find career happiness. A very helpful insight!

Experiment 5 — Top Feature Propositions by Career Stage

These insights led into our next experiment, which was a big one! We essentially built proposition pages for the most popular features for each of the five career stages. We added some personalisation to each page based on the user’s survey results. And then made signing up really low friction (just a few fields on the same page). And we added a cost to each one, saying that we’d charge users once they were accepted onto the programme (they would join a waiting list, similar to our first experiments).

#5: Our Goals & Results

We were testing a few assumptions with this experiment. Firstly, we wanted to see if our existing users would engage with some of our propositions. We wanted to learn which proposition at what career stage resonated the most. And we wanted to see if users were willing to pay for a proposition (in theory at least, as we weren’t actually charging them yet). Our specific goal was to get a 2.5% opt-in rate for our propositions (we aimed low as they had a cost attached to them).

Next Steps

So we’ve identified the two features that are most popular amongst our users: mapping them in a richer way to careers that suit them, and giving them personalised job alerts.

Experiment 6 — Send Traffic Directly Into Our Proposition Pages

So Emma set to work on new Google AdWords ads and Facebook ads to send people directly into our five proposition pages. I updated the code so these pages would work as standalone pages, rather than requiring people to go through our survey first.
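As a rough illustration of that change, here's a minimal sketch, assuming a Flask app and a hypothetical get_survey_results helper. If a visitor arrives from the survey we personalise the page; if they land directly from an ad, they get a generic version instead of being bounced into the survey:

```python
from flask import Flask, request

app = Flask(__name__)

def get_survey_results(user_id):
    # Hypothetical helper: in the real app this would load the user's
    # saved survey results from our database.
    return None

@app.route("/propositions/<stage>")
def proposition_page(stage):
    # Personalise if we know the user, otherwise render a generic page.
    user_id = request.args.get("user_id")
    results = get_survey_results(user_id) if user_id else None
    flavour = "personalised for you" if results else "a general overview"
    return f"<h1>{stage} proposition</h1><p>This page is {flavour}.</p>"
```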

#6: Our Goals & Results

Our goal was to see if we had lower acquisition costs than experiment 5. We were aiming for about £20 per proposition. We had some success on AdWords for job alerts (we acquired sign ups for about £10 each). But we struggled to get delivery with AdWords on the other propositions. And Facebook ad sign ups came in at a similar price of £30–40 per sign up.

A New Strategy Forming — Mapping + Marketplace

One big insight we gained from using Google AdWords was that keywords linked to “jobs” were very competitive, especially amongst job boards. This meant that job boards were spending a lot of money to acquire users from Google. And we have users. So we figured we could send the job boards our users in return for cash.

Mapping Attributes of a User to Careers

First of all, we decided that focusing on better matching people to suitable careers is the right focus for us next. We currently match people based on interests. But we can also match them based on their qualifications, skills, values, constraints, personality and aptitude. We decided this for a few reasons:

  • It can be delivered with pure software, which plays to our strengths and enables us to scale with no marginal costs.
  • By focusing on the top of the funnel (deciding which career to pursue), we believe we can have a bigger social impact than helping someone enjoy a career that they’re not well suited to. Our data shows that we might be able to increase someone’s happiness by 40% if we help them into a career they enjoy.
  • Deciding what career to pursue is a problem that practically everyone in the world faces. Often multiple times in their careers. The fact that we have really low acquisition costs with our Facebook ads shows how keen people are for help with this problem. As does the fact that this was the most popular feature voted on by our users in experiment 4.
  • We’d capture people at the top of the funnel (right at the beginning of changing careers). Which means we could direct them to all the other services further down the funnel. Which are usually more commercial (the closer you get to an employment result, the more money there usually is). So owning the top of the funnel is very strategic from a commercial perspective as we could be sending users to very commercial players further down the funnel like job boards, training providers and career coaches.

A Personalised Marketplace for All Career-Related Services

And so the second part of our new strategy is to build a marketplace by sending people to downstream services after we help them discover the right career for them. We won’t solve all of a user’s career needs directly — only discovery. But as we’ll know so much about our users at that point, we can probably recommend the very best services that will help them.

And Possibly Better Job Alerts

And the last feature I’m keen to explore is personalised job alerts. This was our other most popular feature that people wanted. And delivering this well would enable us to have an ongoing relationship with the user, rather than losing them as soon as they’ve used our career discovery software.

Experiment 7 — The Next Step in our Personalised Job Alerts Proposition

For our next experiment, we wanted to take the next step towards helping the 20 people who signed up to our personalised job alert proposition (from experiment 5). So Emma put together a Typeform that we sent to all 20 users to better understand their requirements for job alerts.

#7: Our Goals & Results

Our goal was to get a strong response rate (>50%). And identify that we can feasibly solve the majority of people’s requirements or pain points. But sadly we didn’t get a good response rate. Only 1 person replied out of 20 emails we sent. And 3 emails even bounced back. We’ll chase a few more times. But it’s not a promising sign.

Experiment 8 — Career Coaching Initial Calls

Our 8th experiment was to set up and run our first coaching calls, following up with those who signed up to our career change programme in our first three experiments. Even though we decided not to pursue this proposition further, we decided to have a few calls anyway to see what we could learn about the needs people have when changing careers, and how we might be able to help. Emma had already done a lot of research around this. So we figured that it wouldn’t take much effort to test these out and learn from them.

#8: Our Goals & Results

We were hoping to have a 60% conversion to coaching calls. But sadly we only had 3 calls set up after emailing 11 people (27% signup rate). And those 3 people didn’t even turn up for the scheduled calls!

Experiment 9 — Jobs Nearby Links on Final Results

Our first main experiment to test a part of our new mapping + marketplace strategy was to send users to job boards to see how many would click through.

#9: Our Goals & Results — A Wild Success!

Our goal was to get 10% of users clicking through to at least one link. And a 15% clicks : users ratio (which assumes that on average people click 1.5 links).

Experiment 10 — Evaluate Adzuna’s Affiliate Program

So our next step was to try and make money from these clicks. Adzuna is also one of the 20 finalists of Nesta’s CareerTech prize, which meant it was easy to get an introduction to one of their cofounders. And Adzuna is probably the best jobs aggregator out there. So the ideal partner for what we were looking for.

#10: Our Goals & Results

Now it turns out that they pay on “second clicks”, not “first clicks”. So when a user clicks to go through to the Adzuna site and see a list of jobs, we won’t get paid. But if a user then clicks a second time to view one of those jobs in more detail, we get paid. That’s because Adzuna only gets paid on this second click and they share the revenue with us. We knew that second clicks would yield about 10p net revenue to us (for the UK market). So the big question was how many first clicks led to second clicks. If we had a 1:1 ratio, we’d make 10p per click to Adzuna. Which would make it very interesting economically due to our already low acquisition costs.

Cause for Optimism

Optimising Adzuna

When we analysed the jobs that we were sending people to, we found that only 30–40% of clicks actually resulted in any jobs appearing. That’s because we didn’t use the right job title in some cases. In other cases Adzuna doesn’t have the jobs on their platform. And in other cases, people live in areas where there aren’t any jobs in the categories they are searching for. So 60–70% of our first clicks couldn’t have resulted in second clicks because there was nothing to click on!
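Auditing this is straightforward with Adzuna's search API. Here's a minimal sketch, assuming you have Adzuna API credentials; it checks how many live UK jobs each career's search term returns, so we can spot the careers where a click would land on an empty page:

```python
import requests

APP_ID, APP_KEY = "your_app_id", "your_app_key"  # from developer.adzuna.com

def adzuna_job_count(search_term):
    # Ask Adzuna how many live UK jobs match this search term.
    resp = requests.get(
        "https://api.adzuna.com/v1/api/jobs/gb/search/1",
        params={"app_id": APP_ID, "app_key": APP_KEY,
                "what": search_term, "results_per_page": 1},
    )
    resp.raise_for_status()
    return resp.json()["count"]

# Flag careers whose search terms return nothing (illustrative list only).
for career in ["Business Analyst", "Care Worker", "Stunt Performer"]:
    if adzuna_job_count(career) == 0:
        print(f"No jobs found for '{career}': needs a better search term")
```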

Additional Revenue Streams

We could also implement job alerts, which may significantly increase the lifetime value of a user (the amount of revenue we’d make from a user over their lifetime). In fact, Adzuna said a job alert signup generates about 50p, which is equivalent to 5 clicks on job ads. And Adzuna has a product that would let us set this up and test it easily, where they deliver the job alerts rather than us. Long-term, we’d want to send the alerts directly as it’s more strategic. But in the short-term, it’ll be convenient for Adzuna to send them so we can test it with far less effort and time.

A Pathway to Profitability

So 1.2p per user was a start. But there is a pathway where we could materially improve this. Even over the next few weeks and months. Once we can cover our acquisition costs, we’ll be profitable on a per-user basis. That’s because our product is pure software. So we have practically no operational costs to deliver our service.

Experiment 11 — Direct Job Links on our Site + Job Alerts

For experiment 11 we used Adzuna’s API to list their jobs directly on our final results page. And to also give users the opportunity to sign up for Adzuna job alerts.

A Better User Experience

We only tried to list Adzuna jobs if there were at least 20 open roles across the UK. This equated to 59% of our careers (298 of our 504 careers).
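Under the hood, this can reuse the same search endpoint. Here's a minimal sketch of how jobs might be pulled for each career on the final results page: skip the career unless Adzuna has at least 20 open roles across the UK, and show at most the top 5 (credentials as in the earlier sketch):

```python
import requests

APP_ID, APP_KEY = "your_app_id", "your_app_key"  # from developer.adzuna.com

def jobs_for_career(career):
    # Fetch the top 5 Adzuna jobs for a career, but only if there are at
    # least 20 open roles across the UK.
    resp = requests.get(
        "https://api.adzuna.com/v1/api/jobs/gb/search/1",
        params={"app_id": APP_ID, "app_key": APP_KEY,
                "what": career, "results_per_page": 5},
    )
    resp.raise_for_status()
    data = resp.json()
    if data["count"] < 20:
        return []  # too few open roles; don't show this career's jobs
    return [{"title": job["title"], "url": job["redirect_url"]}
            for job in data["results"]]
```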

#11: Our Goals & Results

We were testing two assumptions with this experiment.

Direct Job Links Performance

Our direct job links performed ok. They came in at 1.8p per final results user (50% improvement). 247 people got to our final results page, and 140 of them saw job ads for at least one of their careers (57%). Each of these users saw on average 8 job ads in total out of a potential pool of 44 job ads on average (we only showed the top 5 job ads for any given career). 30 of these 140 users clicked on at least one job ad (21%) and we had a 31% clicks : user ratio (on average people clicked on 1.5 job ads).

Job Alerts Performance

Our job alerts test completely failed! We had 1 subscriber, which gave us 0.2p per final results user. The biggest problem was that only 40 of our 247 final results users saw a job alert (16%). And those that saw one only saw job alerts for on average 1.5 of their careers (where the average user has 16 careers in their final results screen).

Experiment 12 — Direct Job Links on our Site v2

For experiment 12, we improved the direct job links we implemented in experiment 11 to see if we could improve our commercial metrics (currently at 1.8p per final results user). We made a series of small changes.

#12: Our Goals & Results

Our goal was to improve on our unit economics from experiment 11 of 1.8p per final results user. So we set a goal of 2p.

Next Steps

The next obvious step to take here is to understand better why people weren’t clicking as we originally expected. So getting qualitative insights by user testing with people makes sense as a next step here. We essentially need to sit down with 5–10 people to observe them going through our app for the first time to see how they respond when they see these jobs listed. We may not do this as an immediate next step due to our other priorities. But we’ll do this when we’re ready to pursue job listings further.

Actual Adzuna Revenue

Our results that we recorded approximated revenue based on an assumed 10p of revenue per click. We just measured the number of clicks we sent Adzuna and estimated the revenue we’d make. When we checked our actual revenue against Adzuna’s dashboard, it turned out to be a little different.

The Economics of Adzuna Affiliate Links

So in total, blended across experiments 11 and 12, we actually generated 1.6p per final results user. If we just took the results of experiment 11 for direct job links of 1.8p per final results user, and applied the 11% down-multiplier to approximate actual revenue, we’re getting 1.6p. And then with the additional 0.5p from the other Adzuna affiliate links, we’re actually at 2.1p per final results user. Which is progress! And maybe we could increase this to 4–5p or more with some of the big optimisations I mentioned earlier.

Experiment 13 — Link to Udemy Courses

The next experiment we tested was a second type of “marketplace” offering — links to training courses that could help someone get into their preferred career. For this experiment, we showed a second button next to the “Jobs Nearby” buttons that said “View Courses” (we actually ran this experiment immediately after experiment 9, but it makes more sense to group all the job experiments together for this blog post). This linked them to a Udemy search for the career name, which showed a list of relevant training courses for that career.

#13: Our Goals & Results

Our goal was to get a material number of users clicking on these buttons. Specifically we were hoping for 50% of the number of clicks our “Jobs Nearby” buttons were generating. It took us only a couple of hours to set this experiment up and the next day we had our results. We had 33% of the number of clicks of the “Jobs Nearby” buttons. So a little lower than our target, but still material. 1 in 5 users clicked on at least one button (with the average clicking 1.5 different buttons). And importantly, having these buttons didn’t seem to cannibalize the clicks of our “Jobs Nearby” buttons.

Experiment 14 — Evaluate Udemy’s Affiliate Program

We signed up for Udemy’s affiliate program, which took a few days before we were approved. But it required us to manually generate affiliate links for every unique link we wanted to have to Udemy’s courses. And each of our 500 careers needs a unique link. So we decided to do this for 150 careers for this test to save us time. Emma created these links, and spent some time making sure we used optimal search terms for relevant courses for each career. For example “Stunt Performer” yielded courses on theatre performing, so Emma instead used “acrobatics” as a search term as the results were a little more relevant.
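As a rough illustration of the search-term mapping (the affiliate links themselves had to be generated manually through Udemy's program), here's a minimal sketch with a hand-curated override for careers where the career name returns poor results, using the "acrobatics" example above:

```python
from urllib.parse import urlencode

# Careers whose names return poor Udemy results get a curated search term.
SEARCH_TERM_OVERRIDES = {
    "Stunt Performer": "acrobatics",
}

def udemy_search_url(career):
    # Build a Udemy course-search link for a career.
    term = SEARCH_TERM_OVERRIDES.get(career, career)
    return "https://www.udemy.com/courses/search/?" + urlencode({"q": term})

print(udemy_search_url("Stunt Performer"))
# -> https://www.udemy.com/courses/search/?q=acrobatics
```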

#14: Our Goals & Results

We generated 160 clicks in total, but sadly no revenue. So it wasn’t compelling enough for our users to purchase a course. 15% of users clicked on a “View Courses” button (vs 21% in experiment 13). And clickers clicked on 1.3 “View Courses” buttons on average (vs 1.5 in experiment 13). These metrics were slightly lower because we only had ⅓ of the coverage we had in experiment 13, as we only included buttons with links for 150 of our 500 careers. So the click volumes were fine, but it was a failed experiment as it generated no revenue.

Experiment 15 — Test Learning Curve Group’s Courses

And finally, our last experiment. And it was a chunky one!

Building this Integration

There are eligibility criteria for users though. They have to be at least 19 years old and live in England. But not in certain locations, like Liverpool. And they can’t be on an apprenticeship programme. So we used Facebook ads to ensure people were at least 19 years old. We built up a list of the 3,000 postcode prefixes in the UK (called “outcodes”) and mapped those that were eligible. And we asked users as part of the app’s pre-survey if they were currently on an apprenticeship programme.
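Here's a minimal sketch of that eligibility check, with an illustrative outcode set standing in for the full ~3,000-entry mapping we built (a UK postcode's outcode is everything before the final three characters):

```python
# Illustrative entries only; the real mapping covered ~3,000 UK outcodes.
ELIGIBLE_OUTCODES = {"SW1A", "M1", "B33"}

def outcode(postcode):
    # The inward code is always the last 3 characters of a full postcode.
    return postcode.replace(" ", "").upper()[:-3]

def is_eligible(age, postcode, on_apprenticeship):
    # Must be 19+, live in an eligible English outcode, and not be on an
    # apprenticeship programme.
    return (age >= 19
            and not on_apprenticeship
            and outcode(postcode) in ELIGIBLE_OUTCODES)

print(is_eligible(23, "SW1A 1AA", False))  # True for this illustrative set
```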

#15: Our Goals & Results

Our goal was to make some meaningful revenue from this integration. They pay us £30 per completed course. They told us that 94% of users who start a course complete it (probably partly because a user needs to pay them £125 if they don’t complete a course!). And 15–20 leads to their website results in a signup. If these metrics held true for our test, then each click to their website would be worth about £1.40 to us (£30 × 94% ÷ 20 leads ≈ £1.40).

Click Metrics

And our click metrics were stronger than I expected. Which was mostly because our coverage of courses to careers was so strong. I assumed that only 20% of users would see at least one course, but it was actually 69% (942 of our 1,357 final results users). And the average user saw 59 courses! So obviously many courses were shown on multiple careers for users in their final results. I then assumed that 5% of users who saw courses would click. This was slightly better at 6.6% (62 unique users clicked on 83 courses in total, so 1.3 each on average).

Conversion / Revenue Metrics

So if we had the same metrics as the average users on the Learning Curve Group’s site, then with 62 unique users clicking, we’d get about 4 completed courses. Which would be £120 in total revenue. Spread out across our 1,357 final results users, that would be 8.8p each. Which would be a pretty good result!

Other Opportunities

And they’re keen to explore other partnership opportunities with us, which are really exciting. For example, as part of the government’s Kickstart programme they need to offer some career advice and guidance to their students. So we can see if we can help them with that. Plus we might be able to better guide their students to the right courses with our skills mapping product that we’ll be developing in the coming weeks.

My Biggest Personal Challenges During this Phase

I found that the most challenging aspects of running these 15 experiments were dealing with the failures psychologically and coping with the intellectual intensity of such rapid learning, creativity and intense execution (in particular the coding).

Our New Strategic Direction

The biggest positive outcome from these tests is the new strategic direction we now have. I talked a bit about this earlier at the end of experiment 6. But the learnings we then got from the rest of our experiments further clarified this direction for us.

Thoughts on Commercialisation

Our “marketplace” is one way we can make money, which is matchmaking people to services and jobs. That’s what we’ve been testing with half of our experiments over the past 6 weeks. And we’ve generated some revenue that way already. As we scale the partnerships, coverage and relevance, we should be able to meaningfully improve this revenue.

Commercialisation vs. Social Impact

I’m doing this startup to make a difference. My personal career goal is to make a material difference to the wellbeing of millions of people. This startup intersects with my personal mission by trying to help millions of people find career happiness.

What’s Next?

Running these product experiments over the past few weeks has been demanding. So I’m relieved to be moving into a different phase. We’ll be focusing on building out user value for the rest of this year. We’ll be doing this in two ways. Firstly, we’ll improve career discovery by mapping people’s qualifications, skills, values and constraints to careers. And secondly we’ll be building out our marketplace to connect people to more targeted services that will help them with their career.

A New Nesta Prize

We’re currently part of Nesta’s CareerTech prize. They gave us a £50k grant, along with 19 other finalists. And the winner and runner-up will get a further £120k and £80k respectively in the new year. So we’re keeping one eye on that to ensure we put a good application forward.

Onwards!

If you’ve read this far, thank you! This is my longest blog post so far, but I wanted to capture our journey over the past 10 weeks, as I’ve personally learnt a lot and I hope it has some value for you too. I really value the responses I get from friends who read my blog posts, with offers of help or insights. So if you have any, please do get in touch with me.
