SaaS: How We Automated Free Trial Extensions To Increase Paid Conversions By 33%

We’re extremely passionate about, and confident in, our purpose as a startup. We know what we want to do in this industry: we want to make other SaaS companies more successful.

But the exact way that we execute that purpose has been a very fluid process since we started work on our very first prototype in May 2013. Like many Lean Startups, our product needs to be able to adapt and evolve as we search for our own product-market fit.

And that means getting constant feedback from customers and potential customers.

“Would you sign up for this? Would you use that every day? Would you pay for this?”

You can ask potential customers these sorts of questions in interviews all day long, but the ultimate validator comes when they’re actually faced with that credit card form.

As the saying goes, “Pay up or shut up”.

We’d already validated our original purpose with a $29 Beta queue jump and $499 “Launch Partner” deal. But our product had changed considerably in the last few months, both in target customer and price point, and in some cases we were asking users to pay upwards of $299 per month.

Now here’s the honest truth: since we launched our public Beta, users were getting to the end of their free trial (after installing our tracking code) and then… nothing. They weren’t paying. Our conversion rate sucked!

Something had gone wrong. Had we misunderstood our customer conversations? Had we been arrogant to assume these companies would actually ditch their existing customer analytics and put their faith in us? All of our feedback in emails and conversations had been great, everyone had told us they loved the product!

We were spinning and didn’t know what was wrong.

We needed to get very specific feedback. We already knew people were using the product from our own data tracking. But asking “Would you pay for this?” hadn’t translated into conversions. We needed to ask a different question to a different segment of our users.

“Why did you not pay, after using the product successfully during your free trial?”

This post is an explanation of how we segmented our onboarding users, and then how we implemented a super simple, very scalable tactic to filter out our most engaged users and find out why they weren’t ready to start a paid subscription.

By the end of this process, we were receiving 27% more feedback, and increased our paid conversion rate by 33%!

The Different Segments Of Users In Our Free Trial

Once a user signs up for a free trial, they end up in one of these 6 basic segments:

  1. Not interested at all. They churn within the first few minutes without even integrating our tracking code. They never come back
  2. Not interested in integrating our tracking code right now, but might come back to test when they have an appropriate project
  3. They tested the product, but they hated it! There’s too much wrong, and they’ll never come back to try it again
  4. They tested the product, and they loved it! But they just don’t need it right now
  5. They tested the product, they loved it, and they paid without problem at the end of the free trial
  6. They tested the product, and they liked it. They do have a need right now, but there’s at least 1 problem stopping them from subscribing

Ideally, we’d only have people in segments (4) and (5). Our goal was to find out what went wrong in all the other situations so that we could correct it.

However, each of these segments has a different responsiveness to feedback requests and sits at a different stage of our AARRR funnel.

We decided to look deeper into each segment, and see where we could get the biggest wins.

1. Users Who Were Not Interested At All

For these users, there isn’t usually much we can do. While our marketing may have been effective at piquing their interest, at some point there was a disconnect between the product and the marketing.

They couldn’t even be bothered to test the product – perhaps because they assumed it wasn’t for their needs or because the effort was not worth the reward.

We’ve had really poor success with engaging these users in conversations – they don’t respond to exit interview emails.

How To Identify: They didn’t activate (install our tracking code). They never log back in after first signing up and they don’t open any of the onboarding emails

Action: Nothing right now. Just keep sending automated exit interview emails and come back to optimise later

2. Users Who Weren’t Interested In Testing The Product Right Now, But Might Come Back Later

We see these users opening our content marketing emails (linking to posts such as this one) and although their account remains inactive, we see a passive interest in our brand. We need to wait until they decide to come back and take more action so that we can engage them in a conversation about the product.

Right now, we’re happy for these users to stay on our marketing list.

How To Identify: They didn’t activate. They haven’t logged back into their account. They do open our content marketing emails

Action: Nothing right now. Just keep sending these users content marketing and see if they come back later

3. Users Who Tested The Product, But Hated It

These are the most painful users as a founder. Getting users through Activation for a product like ours is extremely hard. Most developer tools have activation rates around 10% – 20%, so losing a user who has gone through all that effort is particularly painful.

The only thing we can do with these users is keep trying to get them in an exit interview so that we can find out where we went wrong and how to correct it for future users.

We can learn from these users, but they’re very unlikely to become paying customers.

How To Identify: They activated (installed our tracking code). They seem to have looked around, but never came back. They don’t open our content marketing emails.

Action: Nothing right now. Just keep sending automated exit interview emails, which we can optimise later.

4. Users Who Tested The Product And Loved It, But Don’t Need It Right Now

These users sting, but there’s not much you can do. A sale requires both interest and requirement.

Because we’re seen as a “cool” product, we get a lot of users testing us out to compare against existing solutions, or out of curiosity. But when it comes to paying, they don’t: they’re on an open source project (and won’t see a financial ROI), they have a zero-revenue startup (and no funding), or their employer won’t purchase this.

While gaining some viral promotion via open source projects is cool, we were determined to optimise the product for revenue, and so we didn’t see a whole lot we could do with these users other than be patient.

How To Identify: They activated (installed our tracking code). They used the product a lot, but it looks like they have development/staging data. They churn as soon as their free trial expires

Action: Offer open-source projects a free account. Keep sending users content marketing.

5. Users Who Tested The Product, Loved It and Paid

We didn’t have enough of these users, but we wanted more! We’re trying to focus on a low-touch product and so ideally we’d like users to go through the onboarding and purchasing process with very little Sales Rep involvement.

How To Identify: They activated, use daily, invite colleagues and enter a credit card without fuss.

Action: n/a

6. Users Who Tested The Product And Liked It, But Have A Problem Preventing Them From Purchasing

These are the users we identified as the “low hanging fruit” in our free trials. They’d gone to the effort of installing our script and they’d continued to use the product; however, something was stopping them from paying.

They had activated, they had become sticky, but they weren’t paying. We knew we needed a tactic that was more than just sending an automated email. Manually reaching out wasn’t scaling (we’re a team of 3!), so we needed to try something new.

How To Identify: They activated (installed our tracking code). They were coming back each day to view customer profiles. They were opening content marketing emails. They hadn’t entered a credit card and had churned on the day that their free trial expired.

Action: Try offering a free trial extension in exchange for feedback, act on the feedback case by case, and treat them as a highly engaged lead
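Pulling the six “How To Identify” rules together: they boil down to a handful of behavioural flags per user. Here’s a minimal sketch in Python (the field names are hypothetical; our real segmentation queries our own tracking data):

```python
from dataclasses import dataclass

@dataclass
class TrialUser:
    """Hypothetical behavioural flags collected during a free trial."""
    activated: bool          # installed the tracking code
    logs_in_daily: bool      # keeps coming back to the app
    opens_marketing: bool    # opens our content marketing emails
    entered_card: bool       # entered billing details
    has_live_data: bool      # real production data vs. development/staging data

def segment(user: TrialUser) -> int:
    """Map a trial user onto one of the 6 segments described above."""
    if not user.activated:
        # Never installed the tracking code: segment 1 or 2,
        # split on whether they still show passive interest in our brand.
        return 2 if user.opens_marketing else 1
    if user.entered_card:
        return 5  # tested, loved it, and paid
    if not user.logs_in_daily and not user.opens_marketing:
        return 3  # tested it, but never came back
    if not user.has_live_data:
        return 4  # loved it, but no real project right now
    return 6  # engaged, with live data, but something is blocking the purchase
```

For example, an activated, engaged user who is only running staging data falls into segment 4, while the same user with live data and no credit card on file is the segment-6 “low hanging fruit” we target below.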

Offering A Free Trial Extension

Once we’d decided to start offering free trial extensions, we knew that we didn’t want to do this via automated emails. We already have enough emails going to our users, and we’re always reliant on open rates for them to be effective.

We decided to put our extension tool directly into our product – after all, this is when the users are most engaged with our app and thinking about whether they need it.

Remember, the reason we’d started this in the first place was that the feedback from these 2 questions is very different:

“What would you pay for?” vs. “Why didn’t you want to pay?” 

We knew this feedback had to be exactly at the point of paying.

What Does It Look Like In Our App?

While it did need some coding, the total task of adding it was around 4 hours, including building the admin screen (so that I didn’t need to access the database directly).

Here’s the modified subscription plan screen within the app:



Notice how we’re very specific about using the language “Why aren’t you ready to pay?” in the feedback form?

We want users to be really honest. We’re done asking “Do you like this?” – now we’re asking for money.

And here’s the admin screen we built to view the trial extension reasons, and also add an additional extension ourselves if needed:


Would People Still Pay If They Could Get It For Free?

One of our biggest reservations before we added this so blatantly on our payment screen was, would people just keep hitting “extend” instead of paying?

To try and deter this, we set the extension to 3 days. We figured this was basically “nagware”, and that users who were just looking for a ‘free meal ticket’ would soon get fed up. However, 3 days was long enough for me to intercept, open a discussion, and then manually add a longer trial extension if needed.

As it turns out, we haven’t had a single case of a user abusing the trial extension option, and since implementing it users have been paying to subscribe despite having the option to get a free 3 day extension.
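The mechanics behind this are tiny. As a rough sketch in Python (hypothetical names; our real version sits behind the billing screen and the admin UI), the self-serve path just records the reason, pushes the trial end date out by 3 days, and caps automatic extensions at one so that anything longer goes through a human:

```python
from datetime import date, timedelta

EXTENSION_DAYS = 3  # short enough to act as "nagware", long enough to intervene

class Trial:
    def __init__(self, ends_on: date):
        self.ends_on = ends_on
        self.extension_reasons: list[str] = []

    def request_extension(self, reason: str) -> bool:
        """Self-serve extension: record why the user isn't ready to pay,
        then push the trial end date out. Only one automatic extension;
        anything further is granted manually via the admin screen."""
        if len(self.extension_reasons) >= 1:
            return False  # self-serve extension already used
        self.extension_reasons.append(reason)
        self.ends_on += timedelta(days=EXTENSION_DAYS)
        return True

    def admin_extend(self, days: int) -> None:
        """Manual extension issued after a follow-up conversation."""
        self.ends_on += timedelta(days=days)

trial = Trial(ends_on=date(2014, 6, 9))
trial.request_extension("Haven't had enough time to evaluate yet")
assert trial.ends_on == date(2014, 6, 12)
```

The one-automatic-extension cap is what keeps the “free meal ticket” risk low while still capturing the feedback reason every time.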

What Sort Of Feedback Did We Get?

As you can see from the screenshot above, far more of the feedback than we expected came from users who simply didn’t feel they’d fully evaluated the product yet.

We’d waited 30 days to give them a “nudge”. This sucked.

We immediately dropped our trial period to 14 days.

In these cases, the very fact they’ve extended their trial is a great indicator of interest, and provides “warm leads” for me to focus on in conversations. Instead of waiting 30 days to ask the hard question, we halved the time it takes to get to this point.

I’ve received a 100% (so far!) email reply rate when I reach out to users who extend their trials simply because they haven’t had a long enough chance to evaluate the product.

Uncovering these users was great, but it didn’t provide the missing product feedback we needed. We already knew from our usage data which users weren’t logging in often enough to test the product. Instead, we needed to hear from the users who had been actively using it.

So, we were thrilled when we started seeing feedback come in such as:

  • “I’m worried about how the pricing is based on # of users and not # of active users. Can you explain please?”
  • “We can only pay by invoice”
  • “I couldn’t understand how funnels worked”
  • “I couldn’t find where to setup customer segments”
  • “I heard you guys calculate a ‘health score’ like Gainsight but couldn’t find that”
  • “We need to be able to manage users via the dashboard, not just API”
  • “We need the ability to drill into event meta data”
  • “Couldn’t see how to send emails based on events”

These were the golden sort of blockers we could resolve, and use to inform our product roadmap!

  • “I’m worried about how the pricing is based on # of users and not # of active users. Can you explain please?” – We updated our pricing to use ‘Active Users’. The user became a paying customer
  • “We can only pay by invoice” – We added invoice payments on Enterprise plans
  • “I couldn’t understand how funnels worked” – We got this one a lot. We walked the users through our funnels product. After many realised it didn’t suit their needs, we decided we needed to re-assess the funnel report and how it fitted into our overall product
  • “I couldn’t find where to setup customer segments” – We’ve been working on this as part of the new CSM product. We informed the customer and set up a product demo
  • “I heard you guys calculate a ‘health score’ like Gainsight but couldn’t find that” – Again, we’ve been working on this as part of the new CSM product. We informed the customer and set up a product demo
  • “We need to be able to manage users via the dashboard, not just API” – We fast-tracked this feature (it’s small), and the user became a customer
  • “We need the ability to drill into event meta data” – We reached out, explained this was coming up in our new CSM product release, and set up a demo
  • “Couldn’t see how to send emails based on events” – We explained we’ll support Alert emails but not marketing automation, and pointed the user towards an alternative. We reworded some of our marketing to make this clearer

These are just a few of the most recent pieces of feedback we’ve received. Not all of them can be resolved into sales: some we can use to change our marketing or our self-serve support materials.

What was particularly interesting was all the feedback we received about using our 2 main data reports: Metrics and Funnels.

We learned just how many users didn’t really understand them, and that allowed us to dig much deeper into those users’ usage data. On the surface, they’d used the features perfectly. It wasn’t until we got into conversations that we realised they’d only been scratching the surface, and hadn’t managed to see an obvious ROI from them.

It was a great lesson for us: product usage doesn’t always translate into product engagement.

How Many People Used The Extension Option?

For the last 30 days, of all users who have reached the end of their trial and have been presented with our payment screen inside the app:

  • 9% of users entered their billing details without hesitation
  • 27% of users opted to receive a free trial extension
  • 64% of users just left the app

Before we had the free trial extension, the numbers for paid conversions were identical:

  • 9% of users entered billing details
  • 91% of users just left the app

So we were definitely getting feedback from users who would otherwise have just left!

With those free trial extension users, I can focus crazy hard on removing their obstacles, and communicating better about their objections.

And the final result: our paid conversion rate is now 12%!
(And all of the additional conversions are users who have had at least 1 free trial extension)

Even though we added 3 percentage points to our paid conversion rate for active users who reach the end of their trial, this is still lower than our targets. However, we now have far more feedback to work with than we were getting from exit interviews, and it’s feedback from the users who are most likely to pay. And it’s a 33% increase!
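To make the arithmetic explicit (using the rounded percentages quoted above):

```python
before = 0.09  # paid conversion rate before the extension option
after = 0.12   # paid conversion rate with it

points_gained = after - before            # 3 percentage points
relative_uplift = points_gained / before  # ~0.33, i.e. a 33% relative increase
print(f"+{points_gained:.0%} points = {relative_uplift:.0%} relative uplift")
# prints: +3% points = 33% relative uplift
```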


This was a long post to explain a simple tactic that we added to our billing and feedback process, but it’s a situation with many variables and factors at play and I wanted to be totally transparent so that other startups could see how this could apply to their product.

One of the biggest surprises for me was just how many users used the free trial extension because they had just “ran out of time” (after 30 days!) but eventually went on to become paying customers. Previously, these people would have simply let their trial expire and then not bothered sticking with the product.

I know many startups will offer discretionary extensions if you ask/beg, but we’re trying to be more transparent.

We have transparent pricing. We have transparent data policies. And now we have a transparent free trial extension policy. And it works great!

Published by

Liam Gooding

Liam is the cofounder and CEO of Trakio. Previously an engineer, he writes about growing subscription companies using data-driven techniques and inside glimpses into Trakio's own growth journey. He wrote a book, "Growth Pirate!", which discusses data-driven growth strategies for startups.

  • Wow ! What a post. Thanks a lot for sharing this.

    I’ll definitely use some of the tips you shared and tried here. It’s awesome, and I love the idea that the user gives feedback and you give them back some more free trial time.

    We’ve already found at SunnyReports that the moment and the place you ask for feedback are critical. Just by making a change on this point, we increased our users’ feedback by 50%.

    • Great to hear and thanks for the kind words Bastien

      SunnyReports seems like a really valuable tool for agencies. I used to own an agency, and something like that could save hours per week!

  • Liam Great post,

    we’re going to add this to our own SaaS and let you know how it goes for us!

    • Fantastic! Removing the “Email us to request…” barrier was definitely a huge part of why this worked for us I think. I’d love to hear your results after you’ve tried it out

  • We have done this from day 1 for our Conversion optimization SaaS. It’s so important to let users test the product first before taking out the credit card.

    • Awesome to hear guys! Did you experiment with length of the ‘default’ free trial at all? 7, 14, 30, 60 days?

      • No, at the moment we have a 7 day free trial but most companies seem to be doing 14. So 14 is probably a better option.

  • mattwheelr

    Awesome article, and perfectly timed for us. Thanks Liam. Implementing that next week on Driftrock!

    • Awesome to hear Matt, look forward to hearing the results!

      p.s. Cheers for the positive words to David @ Forward 🙂

  • Just sent you an email Liam which this post has sort of answered! 😉 Did you consider a true SaaS “Pay as you Go” non committal model to encourage take up?

  • Really interesting read guys but one sticking point for me is 5) Action: n/a.
    In my mind that’s where the most action needs to be. Customer is onboard and happy to pay – how do you make sure they keep paying and are happy?

    • Hi Chris,

      I totally agree with you, and I realise I’ve been a little bit over simplistic on that part. You’re totally correct in calling me out on that!

      I meant to say the action we need to take to convert them into a paid user is nothing, as they’re already paying (which was the task at the time)

      However, *keeping* them paying needs a whole CSM strategy in itself.

      It’s something I want to write more about. However, I’d be writing 50% from my experience here and 50% from previous industry experience and academic discussions. The reason being, we don’t yet have 2+ years of historical payment data to draw conclusions from, so it’s very anecdotal, along the lines of “No paying customers have churned yet, and I think it’s because we did this”.

      But yeah, I’d love to explore the topic further!

  • Michael Sórm

    Hi Liam,

    I’m a bit confused with this, why would someone want to extend their trial by 3 days when they aren’t even ready to use the product?

    • Hi Michael,

      Good question.

      The general idea is that they have started using the product, but don’t feel like they’ve had enough time to test it. So a few more days gives them a bit longer.

      When users submit the extension request, I do a manual follow-up with everyone and can then handle it on a case-by-case basis.

      People who just aren’t interested don’t bother submitting the form for an extension and just drop off…

  • 308Win

    Interesting stuff and a good argument for extending people who may not be there yet. Our app is direct to the public and we have a plan so that if the free trial period doesn’t convert I had thought of offering another free period just as an extension. The idea of getting some valid feedback as to ‘why’ they haven’t paid is a great one. If you lose them so be it, but getting an additional 33% in conversion seems like a sound investment and more importantly the knowledge gained helps you improve your product. Great post, thanks Liam.

  • Kate H

    Great post! We liked it so much we included it in Chargify’s blog post “Top 5 Ways to Increase Free Trial Conversions & Revenue.”

  • Yoav Aziz

    Really great post, loved it.

  • Very interesting post! Thanks for taking the time to share this feedback.

    On our side at sunnyreports, our funnel conversion during the trial shows the same user segmentation, but at the end there are two main categories:
    – the people who find the tool fine for their needs and the pricing OK, and who buy it in the first 2 days,
    – the others (who may or may not have linked their AdWords accounts, sent a report, …) who aren’t convinced straight away, and who, even after retrying through the emails we send during the 14-day free trial, still aren’t in love with our app.

    When they want a trial extension, they ask us. It’s easier for us to help them in this phase.

    I have myself the same behaviour with all the SaaS tools I use every day. The first sight and the first feeling seem very important. If I need more explanations, I contact the support team.

    When speaking about behaviour on the web, I always try to relate it to “real” life. Here, it’s like visiting your future home. When you feel comfortable because everything is where you would have put it (or even better than you would have done it yourself), you can directly project yourself into this home.

    I also love the way Buffer explains to their client why the service is not free (in each invoicing email): because they need money to exist. I need the service, I love the simplicity and I am happy to pay a fair price to continue to use it every day. It’s valuable.