
App mistakes: the 10 lessons we learned launching (& killing) our $200K mobile app

If you’ve ever followed through with something you thought was a great idea, only to see it fail, then this post is for you.

You see, earlier this year, we killed our mobile app.

It was a painful process. After all, we had invested over 3,500 developer hours and $200,000 into it.

But because of four key mistakes that we made, the app was doomed from the start.

In today’s post, I’ll share exactly what those mistakes were, as well as the six steps we’d take today to make sure they never happen again (10 lessons in total).

It’s the story of how we failed to take an idea successfully to launch, all because we were focused on the wrong things.

Last updated: 18 Aug 2022
Reading time: 13 min


Part 1: the birth of a mobile app

When we launched Hotjar in 2014, our users loved the fact that they could now use Heatmaps, Recordings, Conversion Funnels, and Feedback Surveys to gain insights into their websites and web apps.

It didn’t take very long before they started asking if they could use the same tools on their mobile apps.

Now: to have Hotjar work on mobile apps, we’d need to build a Software Development Kit (SDK). Just like you add the Hotjar script to your website code to track actions on your site, you’d need to use a Hotjar SDK to track actions on your mobile app.
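To make that difference concrete, here’s a rough sketch of the two integration models. The snippet URL, interface, and function names below are made up for illustration; they are not Hotjar’s actual snippet or SDK.

```typescript
// Web: tracking is added by dropping a small script tag into every page.
// The URL is a placeholder, not the real Hotjar snippet.
function injectTrackingSnippet(siteId: number): void {
  const script = document.createElement("script");
  script.async = true;
  script.src = `https://static.example-analytics.com/site-${siteId}.js`;
  document.head.appendChild(script);
}

// Mobile: there is no page to inject a script into, so the app has to bundle
// an SDK and call it from its own code. This interface is hypothetical; it
// only illustrates the shape of an SDK integration.
interface MobileAnalyticsSdk {
  init(options: { appId: string }): Promise<void>;
  trackScreen(name: string): void;
}

async function startTracking(sdk: MobileAnalyticsSdk, appId: string): Promise<void> {
  await sdk.init({ appId }); // start capturing sessions inside the app
  sdk.trackScreen("Home");   // every action has to be reported by the app itself
}
```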

The problem was: website development and mobile app development are two different planets. None of our product developers had built an SDK before.

In fact, none of us had ever even built a mobile app.

So, the idea was to give ourselves a crash course in mobile app development by building our own app version of Hotjar. Then, once we understood enough about the platform, we’d move on to the SDK our users had requested.

We assigned two developers to the project and figured it would take a couple of weeks at most.

We didn’t realize it at the time, but we had already made mistake #1...

Mistake #1: we didn't validate the idea before we acted on it

Hotjar customers had been asking for an SDK, not a mobile app.

In fact, nobody was asking for a Hotjar mobile app. There were zero requests on Receptive (the tool we use to track our customers’ feature requests), zero user requests in app stores, and zero desire shown in our customer feedback.

Nothing.

We also hadn’t spent any time asking our users, “if we were to build a mobile app, what would you want to do with it?”

Had we asked, we might have learned that users mostly wanted to view recordings on mobile. But guess what? Because the tech that lets people watch Recordings in a web browser doesn’t work at all on mobile, we figured ‘let’s just scrap recordings and focus on the rest’.

Yes: the one thing people might have wanted was the one thing we didn’t do.

It was no surprise that when we launched the app in November of 2017, reactions were pretty lukewarm and retention started dropping right away. The majority of complaints centered around the fact that you couldn’t play recordings on the mobile app.

But that wasn’t the only mistake we made...

Don't make the same mistakes 🔥

Grab a free Hotjar trial and start collecting feedback so you can understand what your customers want and build what they need.

Mistake #2: we underestimated the resources required to build for mobile

Not only did we completely miss the mark with Recordings, we also underestimated the complexity of building a mobile app, and didn’t think about what it would take to maintain and support it.

As I mentioned, none of us had experience with mobile development. On top of that, we didn’t have any customer support people in place who were ready to help out. They’d be the ones facing brand new issues coming from our app users, without having the required experience and training.

The issues just compounded from there…

What additional security features would we need to include for mobile? How would we support it from a GDPR point of view?

We’d likely need a new team just to maintain the app. A dedicated product manager, a dev team, support people, plus marketers to promote it.

None of that had been planned out.

Mistake #3: we didn’t focus on what would have the biggest impact for Hotjar

At some point while the mobile app was being built, I joined Hotjar as a Product Manager.

Keep in mind, before me, Hotjar had ZERO product managers.

And while building an app to get to an SDK sounded like a good idea, there was no one there in charge of pushing back and/or asking the question: is this really the most important thing we could be focusing on?

When I came in, we started looking at the bigger picture. We asked ourselves, what’s missing in Hotjar as a whole that would have the biggest impact for our users?

As it turns out, it wasn’t a mobile app or even the SDK.

We saw areas for improvement in Hotjar, like fixing Heatmaps or supporting new targeting rules, that a large chunk of our users wanted, versus the minority who were shouting really loudly for the SDK.

GDPR was also just around the corner, and getting ready for it became one of the core priorities of the product and development team.

It was really a case of not thinking things all the way through, which leads me to the next mistake we made:

Mistake #4: we didn’t think Minimum Viable Product (MVP) and lost sight of the original goal

Basically, once we assigned the two developers to the project, the team came up with an initial scope for the app, then committed to getting it done.

Without any feedback from our users (see mistake #1), we had decided to take all the features from Hotjar—minus Recordings—and put them into the mobile version.

Again, we thought this would take us a couple of weeks. Tops.

When that turned into a couple of months, no one stopped to ask: what’s the smallest thing we could do to get this app into the hands of our users as quickly as possible?

Also, keep in mind that the original goal was to get to an SDK. The app was just an in-between step.

We should have been asking ourselves, ‘what is the minimum viable product here that will help us learn about mobile as quickly as possible so that we can get to the SDK?’

We were so committed to building the damn thing as we had planned it that we chose not to downscope and cut things out.

Partly, this was a symptom of being an early-stage startup that was trying to keep up with rapid growth (a nice problem to have, to be honest). One of the two developers assigned to the app was our Director of Engineering and co-founder, Marc Von Brockdorff, who was essentially a one-man product team and also leading several other initiatives.

At this point, it was around October 2017 and the app was almost complete. So we decided to just ship it and see what would happen.

The death of an app

When we launched the app in November 2017, it didn’t get any traction. We sent out an email announcing the app to about 1,000 users who had expressed interest in testing out the beta version (interest which, again, we had only collected after we started building it).

As you can imagine, people signed in, played around for a day, then never came back.

Around February or March 2018, we were down to a maximum of 100 active users. We had much bigger, clearer priorities by this time, and the mobile app was now a burden that we weren’t willing to carry any further.

So in April, I removed the app from the stores and stopped all new user acquisition.

In May, we gave existing users a 30-day grace period and told them, "Hey, once this is over, we pull the app."

By June it was dead.

But it wasn’t all for nothing…

Part 2: if we were to do it all over again, here’s what we’d do differently

Our app crashed and burned. If we had asked the right questions in the beginning, involved our customers in the development process early, and listened to what they said every step of the way, the outcome would probably have been different.

Instead, we built and launched it to “see what would happen.”

But out of the ashes, some really important lessons emerged. And if we were to do it all over again, here’s exactly what we would do today:

How to go from idea to launch in 6 steps

Step #1: first, find out: "is this even a thing our customers want?"

If you came to me today and said, "Hey, Stefan, let's build a spaceship!"

I'd say, "Listen. I'm not sure if it's the thing we wanna do here. But let's ask our customers and see if they want that."

We’d start with some research and check all the places where we collect any kind of customer feedback: Receptive, Incoming Feedback, NPS® scores, surveys, polls, and customer tickets.

We’d check those first to see how often the spaceship theme recurs. This is what lets us get quantitative validation for an idea. If it didn’t show up, we’d drop it right there.

This is something we already do most of the time. Any time a customer requests a feature directly, we upvote that request inside of Receptive.

For us, it’s easy to see which features are the most desired by our users: the ones with the most upvotes. If something has around 400 upvotes, it’s time to start considering it.
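As a rough illustration of that quantitative check, here’s a small sketch that ranks feature requests by upvotes and flags anything above the ~400 mark. The data shape and the threshold constant are illustrative; this is not Receptive’s actual API.

```typescript
// Hypothetical sketch: rank feature requests by upvotes and keep the ones
// worth considering. The ~400-vote threshold and the sample data are
// illustrative only.
interface FeatureRequest {
  title: string;
  upvotes: number;
}

const CONSIDERATION_THRESHOLD = 400;

function shortlistRequests(requests: FeatureRequest[]): FeatureRequest[] {
  return requests
    .filter((r) => r.upvotes >= CONSIDERATION_THRESHOLD)
    .sort((a, b) => b.upvotes - a.upvotes); // most-wanted first
}

// Example: in this toy data the spaceship clears the bar, the SDK does not.
const shortlist = shortlistRequests([
  { title: "Build a spaceship", upvotes: 412 },
  { title: "Mobile SDK", upvotes: 87 },
]);
console.log(shortlist.map((r) => `${r.title}: ${r.upvotes}`));
```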

If the data showed that enough customers were requesting a spaceship, we’d move on to the next step:

Step #2: validate the idea directly with our customers (and understand if it's a priority)

At this point, we’d send out a survey to our customers to get qualitative validation.

We’d ask them, “Listen, we’re thinking of building a spaceship. On a scale of 1-7, how important is having one for you?”

Quite often, customers might think they want a spaceship, but end up not using it once it’s built. So we’d also ask them to compare and prioritize it against other features we are thinking about building: “How important is having a spaceship versus creating dynamic heatmaps? Which one would you prefer and why?”

(Personally, I’d go for the spaceship. But I’m not Hotjar’s ideal customer!)

We’d also want to discover what specific problem our customers are facing, to make sure that the spaceship is the right feature to build in the first place.
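One simple way to compare those answers is to average the 1-7 importance ratings per candidate feature and rank them. The feature names and responses in this sketch are made up for illustration; it only shows the idea, not our actual survey tooling.

```typescript
// Illustrative sketch: average 1-7 importance ratings per candidate feature
// and rank them, most important first. Sample data only.
type SurveyResponses = Record<string, number[]>; // feature -> ratings (1-7)

function rankByImportance(responses: SurveyResponses): [string, number][] {
  return Object.entries(responses)
    .map(([feature, ratings]): [string, number] => [
      feature,
      ratings.reduce((sum, r) => sum + r, 0) / ratings.length,
    ])
    .sort((a, b) => b[1] - a[1]);
}

console.log(
  rankByImportance({
    "Spaceship": [6, 7, 5, 6],        // avg 6.0
    "Dynamic heatmaps": [7, 7, 6, 7], // avg 6.75
  })
);
```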

If the survey responses showed that:

  • There is a specific problem our customers are facing

  • Building a spaceship would solve that problem better than any other feature

  • Our customers would value having the spaceship over other features

then we’d move on to the next stage:

Step #3: validate the user experience

The last thing we want to do is build something our customers hate or have no use for—yes, that’s what happened with the mobile app.

To save resources and build the right thing from day one, we’d build a prototype and get it into the hands of our customers ASAP.

This is also something we already do; we use tools like Sketch and Invision to create an interactive design of what a feature would look like and how it would behave. At this stage, there’s no coding involved. Just a design interface.

We usually run a few usability testing sessions where we share the prototype with five to six customers and get their direct feedback in one-to-one calls. Alternatively, you can use free tools like Fluid UI or Justinmind to both design the prototype and gather feedback.

We ask them to complete certain tasks (e.g., “how would you access the menu from here?”) and observe how easy and intuitive it is for them. If it takes them a second, we’re on the right track. If they sit clicking around trying to figure it out, it needs tweaking.

We also use a Customer Effort Score (CES), where customers tell us how easy it was to complete the task on a scale of 1-10.

Finally, we wrap up these user tests by asking people general questions such as:

  • What's your take on this?

  • Does it solve your problem?

  • Are you happy with it?

  • Anything you’d like to add?

This is an iterative process, so we usually do one round of five to six calls, make improvements based on the feedback, then have another round. The goal is to get to the point where overall sentiment is positive and we have a high CES score of 7-8. Usually, it takes about three rounds. 
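As a rough sketch of how a round gets scored, here’s how the CES responses from five to six calls might be averaged against that 7-8 target. The function and sample scores are illustrative, not our actual tooling.

```typescript
// Rough sketch: summarize Customer Effort Score (CES) responses from one
// round of usability calls, on the 1-10 scale described above.
function averageCes(scores: number[]): number {
  if (scores.length === 0) return 0;
  return scores.reduce((sum, s) => sum + s, 0) / scores.length;
}

// After each round of five to six calls, check whether we've hit the
// "high CES of 7-8" bar or need another iteration. Sample scores only.
const roundScores = [6, 8, 7, 7, 9]; // one score per participant
const avg = averageCes(roundScores); // 7.4 for this sample
console.log(avg >= 7 ? `Round passed (avg ${avg})` : `Iterate again (avg ${avg})`);
```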

Step #4: consider the full cost

Before going all-in and starting to build a spaceship, we’d need to scope out what it’d really take to get it launched.

I’m not just talking about development hours. I’m thinking in larger terms: would we need a dedicated product manager? More support people? Someone in marketing to promote it? More engineers?

Before starting our Minimum Viable Spaceship, we’d have to look at everything we’d need to do, including hiring or training the right people.

Only after doing that would we move on to defining what the MVP would look like.

Step #5: define the MVP

Despite the mobile app fiasco, this is a part of the process that we’re also quite familiar with already. It’s where we take a hard look at the ‘nice-to-haves’ vs. the ‘must-haves’.

It’s not an easy process, but we approach it by asking, does this [feature/tool/element] solve the problem directly?

If it does, it’s a must-have (like a door to the spaceship).

If it doesn’t, it’s a nice-to-have (like making the spaceship look like the Millennium Falcon).

The important thing at this stage is to trim out any extra fat so we can get something out and start collecting feedback. The earlier we can get the customer feedback loop started, the faster we can improve the product.

We skipped this step with the app, and wasted a lot of time on a project that took months to build, and months to fail.

✏️ Note: Having a product manager dedicated to asking the hard questions at all stages helps to ensure a much more efficient process.

Questions like: “Hey, is this what we originally imagined? Is this taking longer than expected? Do we need to change course?”

Our CEO, David Darmanin, agrees that “we should have hired product managers much earlier. When you don’t have product managers, and you’re a small team, you don’t have the focus to make good decisions. You tend to have these meetings where you make stupid decisions and fail to see the big picture.”

So at this point in our spaceship enterprise, we’d know what needs to be built to get it out as quickly as possible, and we’d be ready to build and launch a beta version.

Step #6: get the MVP in the hands of our customers

Once the MVP spaceship had been designed and built, we’d test it with a small percentage of our customers to see if we got it right. For a large product like a spaceship or a full mobile app, we’d be looking at about 10% of our users; for small tweaks and add-on features, 100-200 people would be enough.

Just like in the experience validation stage (step 3), we’d have one-to-one calls to see how our users reacted to the new feature. We’d also send out surveys to gauge the user experience.

The rule of thumb is this: once the 10% of customers testing the beta are rating the feature a 7 or 8 out of 10, we’d consider it safe to launch to our entire customer base. And, because we’d asked the right questions throughout the entire validation process, the score we’d get at this stage would likely match the one we’d get when going to market.
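Here’s a minimal sketch of that launch check, assuming the ratings come from the beta group. The post doesn’t pin down exactly what share of testers needs to hit a 7 or higher, so that cut-off is a parameter here, and the sample value is an assumption.

```typescript
// Hypothetical sketch of the launch rule of thumb: what share of the beta
// group rates the feature 7 or higher out of 10, and does that clear the bar?
function readyToLaunch(betaRatings: number[], requiredShare: number): boolean {
  if (betaRatings.length === 0) return false;
  const satisfied = betaRatings.filter((r) => r >= 7).length;
  return satisfied / betaRatings.length >= requiredShare;
}

// Sample data: one rating per beta user. The 0.8 cut-off is an assumption,
// not something the post specifies.
console.log(readyToLaunch([8, 7, 6, 9, 7, 8, 5, 7], 0.8)); // -> false (6/8 = 0.75)
```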

But of course, the process doesn’t end with the launch. Throughout the beta testing, launch, and life of a product, we’d constantly listen to customer feedback and improve the product. And, yes, fail—and improve again.


Our app died, but it gave birth to a new process

We failed, and it took a long time and a lot of resources. But we definitely learned our money’s worth.

Our customer validation process now ensures that we don’t waste time or money chasing ideas we think are good. We know they’re good and they’ll only get better—because they came from our customers.

As our CEO and founder, David says:

“We’ve made this mistake a few times. You have to do it a few times ’til it registers. But what it comes down to is: if you’re going to do something, it needs to be something you truly believe in. It has to be part of your roadmap. It has to be something that delivers value to everyone.”

David Darmanin
CEO at Hotjar

Net Promoter, Net Promoter System, Net Promoter Score, NPS and the NPS-related emoticons are registered trademarks of Bain & Company, Inc., Fred Reichheld and Satmetrix Systems, Inc.
