
A more practical QA process for indie hackers

I've tested a lot of products in my day. It comes with the indie hacking territory. Plus, smoke tests were a necessary evil back when I was gainfully employed — even with a QA team, bugs were common.

A buggy product can be a death sentence for a new business, but indie hackers don't have QA teams to lean on. And most of us don't have the time (or expertise) to test like the pros. So I think it's important to find a balance — a minimum viable amount of QA/QC (quality assurance and quality control).

Here's the practical approach I've learned from experience, research, and comments from other indie hackers.

QA processes by stage

Here's a list of what should be involved in the QA process — specifically for indie hackers. It will differ by stage, so I'll break it into the MVP stage, the v1.0 stage, and then successive updates.

QA process for MVPs

MVPs should be bare-bones, and the same goes for their testing. Keep it simple and save the fancy stuff for later. Things like unit tests are investments that are expected to pay dividends in the future, but prior to your MVP, you don't even know whether your product will have a future. Time spent on this type of thing will be wasted more often than not.

According to @itsbalal:

Validating and going fast on the market does not go along with having the healthiest codebase… All of the successful projects I've been working on have had low coverage, as the people started catching up on testing only after some ground was secured.

The exceptions here are products for which "viability" is held to a higher standard — for example, products in healthcare, finance, etc.

Test planning and prep for MVPs

Put a little bit of time into planning before you get started on your MVP… but not a ton of time.

  • Set requirements: A lot of indie hackers skip this part and just start building, but taking a moment to plan your requirements and specifications before you build will result in better code and easier QA.

Test execution for MVPs

While you're building, review your code and make sure it is satisfactory. Some technical debt is okay at this point, but it's got to be in reasonable shape. Then test.

  • Manual testing: When you complete a feature, test it. This obviously includes testing it for bugs, and should also include usability testing. Does the build meet the needs of the end user? Refer to your requirements.
  • Document bugs: Batching QA is the most efficient way to go, so don't fix a bug right when you find it. Document it and keep testing. For the MVP, a Google Sheet or pen and paper will do just fine.
  • Fix the issues: Not all bugs are created equal. Fix the highest priority issues first. Some may not even need to be fixed for the product to be minimally viable.
  • Verify the fix: Test it again to make sure you actually fixed the issue.
  • Other tests: Load/performance testing and security testing can be a great idea at this point, just to be safe. Tools are below.
  • Smoke test: Use the entire product in every way you can think of. Try to break it. Document and fix any issues. This is the most important test for indie hackers — particularly for MVPs.
  • Release: When the product is viable, prepare your launch and validate your product.

Yep, that's it. You may have noticed that there's not a single automated test in there. Just build it and test it as a user.

QA process for v1.0

Once validated, an MVP often gets overhauled to make it more attractive, stable, and scalable. This is what I'm referring to as v1.0 — it's the build where you're really ready for prime time. And it also happens to be the build where a robust QA process is most warranted IMO (even for indie hackers).

Test planning and prep for v1.0

This time around, spend more time up front planning and prepping for the QA process.

  • Set requirements: Per above.
  • Set up bug tracking: For the MVP, it's pretty easy to shoot from the hip. But now, it's time to set up the process in a way that makes you more efficient. At this point, a Google Sheet may still be fine, but it could be worthwhile to look into testing-management or project-management platforms. More on those below.
  • Prepare a staging environment: Since your product is validated, you should have paying customers. So don't be the founder who tests their build in a live environment — that doesn't fly post-MVP. Set up a staging server so you can test on your end before going live, and make sure it mirrors your production environment's configuration.
  • Design your tests: The advice online gets a little excessive here, so go easy — just document (or at least think about) test cases and flows that need to pass testing. For example, it's probably a good idea to ensure that users can register, log in, pay, and use your main features.
  • Set up unit tests: Unit tests and integration tests (interaction between units) are helpful, but I know a number of indie hackers who don't touch the things, regardless of the stage — they view them as a waste of time. My two cents? I think they can be a good idea post-validation. You just have to decide whether the cost (time/money) of creating and maintaining them is less than the value you'll receive from them. There are lots of products that help with this (see below).

I thought a comment from @lwrightjs drove this point home:

I 100% disagree with the premise that you need tests. Tests are about understanding your costs. If it costs you less to fail than it does to write a test, then you don't need a test. If it costs more to write a test than it does to test it manually, then you don't need a test… Sometimes not writing tests is technical debt, other times it's a technical investment.
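
That cost calculus is easy to make concrete with back-of-envelope arithmetic. A quick sketch in Python (every number here is a hypothetical estimate, not a recommendation):

```python
# Back-of-envelope version of the "cost of a test" calculus quoted above.
# All inputs are hypothetical estimates -- plug in your own numbers.

def cost_of_automating(write_hours, maintain_hours_per_release, releases):
    """Total hours spent writing the test once, then maintaining it."""
    return write_hours + maintain_hours_per_release * releases

def cost_of_not_automating(manual_hours_per_release, releases, failure_prob, failure_cost_hours):
    """Manual checks every release, plus the expected cost of a bug slipping through."""
    return (manual_hours_per_release + failure_prob * failure_cost_hours) * releases

releases = 20
automated = cost_of_automating(write_hours=4, maintain_hours_per_release=0.25, releases=releases)
manual = cost_of_not_automating(manual_hours_per_release=0.5, releases=releases,
                                failure_prob=0.1, failure_cost_hours=8)
print(f"automated: {automated}h, manual: {manual}h")  # write the test if automated < manual
```

With these particular made-up numbers the test pays for itself; with a cheaper failure or fewer releases, it wouldn't.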

As far as what to build tests for, here's @typologist:

For indie hackers, my approach is simple. Write tests for...

  • Critical parts, the ones whose failure would heavily affect your operations
  • Complex parts, that might be really hard to debug and fix in the future
  • Everything else can perfectly wait.
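
To make that concrete, here's a minimal sketch of what tests for a "critical part" might look like. The prorated-billing function and its numbers are hypothetical, purely for illustration:

```python
# Hypothetical "critical part": prorated billing. Bugs here cost money
# and trust, so it earns unit tests; cosmetic code can wait.

def prorated_charge(monthly_price_cents: int, days_used: int, days_in_month: int) -> int:
    """Charge for the portion of the month actually used, rounded down to a cent."""
    if not 0 <= days_used <= days_in_month:
        raise ValueError("days_used must be within the billing month")
    return monthly_price_cents * days_used // days_in_month

def test_full_month_charges_full_price():
    assert prorated_charge(3000, 30, 30) == 3000

def test_half_month_charges_half():
    assert prorated_charge(3000, 15, 30) == 1500

def test_rejects_impossible_usage():
    try:
        prorated_charge(3000, 40, 30)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

Plain asserts are enough at this scale; a runner like pytest will pick up the test_* functions automatically if you decide you want one.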

Test execution for v1.0

Then get ready for some pretty in-depth testing.

  • Static testing: This essentially refers to doing code reviews — making sure your code is up to scratch, and that it should function as intended. When you finish each feature, review the code.
  • Execute automated tests: Whatever tests you wrote, run them. Do this before you do any manual testing.
  • Execute manual tests: At this point, you should have your test cases lined up, so act like a user and work through the flows. I find it helpful to record my screen because it's super annoying when you find a bug but you don't know exactly how to replicate it. And remember to do your testing in the staging environment.
  • Browser testing: It's hugely important to test on multiple screen sizes, operating systems, and browsers. Simulators won't cut it. I'd suggest using one of the browser testing tools listed below.
  • Document bugs: Per above. Consider documenting on a free tier of a testing-management or project-management platform.
  • Fix the issues: Per above.
  • Verify the fix: Per above.
  • Regression testing: Assuming you are building from your MVP (not from scratch), do some regression testing. This means testing key features that were previously functional, even though they weren't touched. Sometimes, messing with one thing messes with another, and suddenly things go awry. Just do a quick sanity check to make sure everything is still working.
  • Smoke test: Per above.
  • Beta testing: Totally optional, but a beta test can be a great idea. Friends and family work fine for most. But there are also services below that can bring in some strangers.
  • Release: When it's ready, prepare another launch and get it out there.

QA process for updates

And then there are the updates. The v1.01s and the v2.0s and on and on. Assuming that your product is stable, the QA process for these can once again be fairly minimal IMO.

Test planning and prep for updates

At this point, you've probably got testing set up for previous versions of the product, so this time around, you just need to add to the list.

  • Set requirements: Per above, but for the new features only.
  • Design your tests: Per above.
  • Set up unit tests: Per above.

Test execution for updates

  • Static testing: Per above.
  • Execute automated tests: Per above.
  • Execute manual tests: Per above.
  • Browser testing: Per above.
  • Document bugs: Per above.
  • Fix the issues: Per above.
  • Verify the fix: Per above.
  • Regression testing: Per above.
  • Smoke test: Per above.
  • Release: When it's ready, prepare another launch and get it out there. You can (and should) launch multiple times IMO. I wrote about that here.

Some QA best practices

  • The best way to uncover any issues is to eat your own dog food. You built the product, now use it. A lot. Use it for what it's built for. Use it for other things too. And while you're at it, occasionally switch up your flows so that you can test in new ways.
  • "Shift left", meaning test earlier in the dev process.
  • The best way to assure quality is to write quality code. Do it right the first time.
  • Keep sprints and release cycles short.
  • Identify the biggest risk factors in the product and focus QA efforts there.
  • If you've got the cash, outsource. You can and should still do some testing, but consider bringing in a pro as well.
  • Use trusted third-party solutions whenever possible — don't build something for payment, for example. You'll save time and you won't put yourself and your customers at risk.
  • Learn from the issues that you find.

QA tools and services

There are about a bajillion QA tools out there, but here are the basics of what an indie hacker might want:

For tracking, I've found project management platforms to do the trick nicely:

  • Jira: Jira is super popular for bug tracking, and for good reason. I'd say this is the go-to.
  • Monday: I'm personally pretty fond of Monday, and I've tracked many a bug there.
  • Trello: I know a lot of indie hackers use this. It's pretty bare-bones, but the free tier will get you where you need to go.
  • Testing management platforms: I've never personally used these, but products like Xray for Jira, TestRail, Testiny, QMetry, and PractiTest are built specifically for managing tests and bugs, not projects/products. Probably overkill, but many of them are capable of automation, which might make them worthwhile.

Speaking of automation, if you want to set up automated tests, here are the top options:

  • Selenium: Probably the most popular name in test automation. It enables you to automate browsers.
  • Cypress: End-to-end JS testing framework.
  • Appium: Automated testing for mobile apps.
  • ACCELQ: Codeless test automation tool.

Load and performance testing:

  • JMeter: Probably the most popular open-source load-testing tool.
  • BlazeMeter: Continuous load and performance testing with a free tier that will cover most indie hackers for a while.
  • LoadRunner: Load and performance testing. There's a free trial if you just need a one-off before launch.
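
If a dedicated load-testing product feels heavy for a one-off pre-launch check, you can get a rough read with a few lines of standard-library Python: fire concurrent requests and look at latency percentiles. A sketch, with a stub standing in for the real HTTP call:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def hit_endpoint():
    """Stand-in for a real request, e.g. urllib.request.urlopen(staging_url)."""
    time.sleep(0.01)  # simulate ~10ms of server work

def load_test(target, requests=50, concurrency=10):
    """Fire `requests` calls across `concurrency` threads; return latencies in ms."""
    def timed_call(_):
        start = time.perf_counter()
        target()
        return (time.perf_counter() - start) * 1000
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(requests)))

latencies = load_test(hit_endpoint)
# quantiles(n=20) yields 19 cut points; the last one is the 95th percentile.
print(f"median: {statistics.median(latencies):.1f}ms  "
      f"p95: {statistics.quantiles(latencies, n=20)[-1]:.1f}ms")
```

This won't replace a real load test, but it will flag an endpoint that falls over at modest concurrency before your customers do.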

Testing services:

  • UserTesting: The most popular user testing site, and I've heard great things. Slightly more expensive than other alternatives but they're able to hit many demographics with their testers.
  • UserBrain: This is similar to UserTesting, but from what I've read, it's a little cheaper.
  • Outsourcing: Contracting a QA to help out can be a really great idea if you've got some revenue. You can find qualified professionals on places like Upwork and Fiverr. They usually cost $25-45/hr from what I've seen. And they'll find things that you (and the people from the services above) won't find.

Beta testing services:

  • BetaList: Community of makers and early adopters showcasing their startups and exchanging feedback.
  • BetaPage: Community of tech lovers and early adopters where you can submit products.
  • Betafy: Provides a place for founders to get relevant feedback from a community of startup supporters.
  • r/AlphaandBetausers: Solid subreddit for beta testers.

Security testing:

  • Vega: Free open-source vulnerability scanning.
  • W3af: Open-source web app audit and attack framework.
  • Sifter: Penetration testing tool originally built by @garrettdimon.

Free resources

If you want to dig a little deeper, here are some free resources:

  • freeCodeCamp's QA course looks to be pretty comprehensive.
  • Free Software Testing also has a course.
  • And here's their free ebook.
  • If you've got 10 hours to kill, here's a very long (and I assume comprehensive) YouTube tutorial by Edureka.

You can really go down the rabbit hole with QA/QC, so I tried to keep this as simple and practical as possible. What did I miss?



Comments

  • Mom Test by @robfitz is another great resource. Super-short and super-practical. Check it out: https://www.momtestbook.com/

    • Yeah, that's a really great book. Thanks for mentioning it!

  • Thanks for the in-depth article. However, are these steps still applicable in today's no-code environment? Of course, some testing is always needed, but if you've built your product with no-code (or low-code) tools, is code-level testing really important?

  • This is a great post, and something I wish I'd read a year ago. One thing I've found super valuable in the absence of automated testing is maintaining good logging practices.

    Bugs are always going to slip through, and when your users come to you for answers, having good logs gives you much better insight into exactly what happened, or, in an ideal case, lets you catch the bugs before users even realize they're there.

    Sure, it takes a bit of a time investment, but it's also something that can be useful during development. Monitoring tools like sentry.io are usually pretty quick to set up and give you that extra bit of insight into what's going on.

  • My testing process is very simple nowadays. For some complex methods where I'm not fully sure about the outcome, I write unit tests. For everything else, I have my own product, Testkit, set up to run through some user flows with assertions for me. The best thing about this is that I can break things without worrying, as Testkit automatically fixes broken tests for me :)

  • I've never tried beta testing my products. I get that it's a good way to get feedback and early adopters, but it seems like it just delays the launch. Why not treat your launch like a beta?

  • Yeah, I never waste time on QA for my MVPs, so I agree — makes sense to vary your QA efforts according to the stage.
