49 Comments

Do you write tests?

Would be interesting to hear how other indiehackers test their code.

I add unit tests for important parts and include some integration tests for important user flows as well. Love seeing the green checks on my CI/CD.

Submitted this link to Developers on September 1, 2022
  1. 12

    I let the users test it for me, LOL.

    1. 3

      Me too. And I don’t get complaints at all from my zero users :-)

    2. 3

      That's basically how I do it. If I don't get a bug report then it's probably not a big deal 🤣 The only reason I might start adding tests is when I have other devs contributing to my code base. Dealing with code you didn't write is very intimidating if there aren't tests.

  2. 6

    For MVPs writing tests for anything that's not critical is a waste of time. At that stage your biggest risk isn't shipping bugs to production, but building something nobody wants.

    Once you get paying users, you obviously want to write tests to avoid shipping stupid bugs.

    1. 5

      I'm drawing entirely from experience in enterprise/FAANG situations but my observation is that it's almost impossible to go back and add test coverage retroactively - it will be difficult to justify the time then as well. Plus, nobody wants to spend a sprint or so just writing tests.

      I've come to form the opinion that ease of writing tests - specifically unit tests - is a good proxy for clean code. That is, if you find that writing unit tests is too time consuming you might be writing code that you may one day find difficult to maintain. Of course, if you're going after a prototype + throwaway, then it changes the conversation.

    2. 1

      Agree! Ship fast, test the important stuff.

  3. 5

    I write tests for two reasons:

    1. To tell me that the changes I made result in the expected outcome
    2. To avoid manually testing things

    Those two really help speed things up.

    The tricky part is to decide when not to test.

  4. 5

    Yup, I do write tests. Out of habit, but I think it helps when the features get more involved and more things could break.

    It does feel like I’m slowing myself down, but it’s an insurance policy to protect my users from bad coding.

    1. 2

      Definitely helps! I think it actually makes me move faster, as I don't have to test as many things manually. I can run all my unit tests in a few seconds.

      I also code in Clojure, so mostly pure functions and those are easy to test :).

  5. 3

    I write tests when it helps development run faster - if I'm writing an algorithm or parser or something like that, any sort of "data in, data out" function, it makes so much sense. Jest in watch mode for testing TypeScript, for example, is a really nice way to progress.
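
    For instance (a minimal sketch; `parseDuration` is a made-up pure helper):

    ```typescript
    // parseDuration.test.ts - run with `jest --watch` for instant feedback.
    // parseDuration is a hypothetical "data in, data out" function: "1h30m" -> seconds.
    import { parseDuration } from "./parseDuration";

    describe("parseDuration", () => {
      it("converts hours and minutes to seconds", () => {
        expect(parseDuration("1h30m")).toBe(5400);
      });

      it("rejects garbage input", () => {
        expect(() => parseDuration("soon-ish")).toThrow();
      });
    });
    ```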

    Integration tests, though - I've written them, and they're terrible and slow down development. For simpler applications, fine, they work well, but on every project I've ever worked on across 3 jobs, I've never seen them have much of an ROI.

    1. 1

      Same here. Most of my functions are pure so it's super easy to write inline test cases. Clojure makes this super easy :).

  6. 3

    Yes. Stackless, being a 'business platform', must be fully tested. However, because I'm a single person writing an entire cloud development platform, I've optimized my use of automated tests so that I'm not double-testing anything except at the lowest levels of the system.

    1. The core - the engine code that interacts with V8 and RocksDB - has 100% code coverage using cpptest unit tests.

    2. The network IO code also has load and fidelity unit tests that run at the same time as the engine.

    3. The front-end runtime TypeScript packages (called the SDK) are both smoke tested via an automated suite of tests that ensures all the use cases I've marketed in the public examples work as advertised. Basically, the example applications I've built are tested thoroughly before a new version of the SDK is published to NPM.

    Coupled with my penchant for developing defensive code, I believe this strategy has provided me with a happy medium between productivity and risk.

    The downside to this approach has been that it's a little tough diagnosing bugs found by the smoke test. But, the extra architectural agility I'm afforded by constraining my direct testing surface area to mostly the "implicit contract" my users expect has been worth the occasional pain.

    1. 2

      Makes sense. Well done on achieving 100% code coverage!

      Just curious, what coverage model are you using?

      1. 2

        I don't know what coverage model Clang is using. Here's what I did in CLion to show coverage: https://www.jetbrains.com/help/clion/code-coverage-clion.html#run-coverage

  7. 2

    Yes, I write tests, but as others mentioned, it depends on a number of things:

    • The nature of the project. A tool that earns you money is arguably something that shouldn't break. A mature product that will see major refactorings should also be tested.

    • The code to (not) test. I tend to write unit tests that are simply quick to write (think: utility methods), and tests for user workflows. Another thing is external APIs that may break your app even without you doing anything, so that's something I'd test. And last but not least, I test code that was not trivial to get right, in order to prove that it's correct.

  8. 2

    IMO - basic unit tests are important and may save you time in the future, but be careful: it is easy to create unit tests with a lot of green checks even when your app is not working.

    These should be your top priority, even before you invest time in unit tests:

    1. Catch every exception, and add logs to your app.
    2. Know your site metrics - for example, if there are Y users on the site with an X% conversion rate, make sure those numbers don't change drastically in newer versions.
    3. Send your site to friends after every major upgrade. You will be surprised how many things they will find.
    4. Use Hotjar (hotjar.com) to track your visitors and make sure everything is working. You can also add a feedback button, but most of the time visitors won't make the effort to tell you about bugs.
    5. Pay a few dollars to find someone on Fiverr for QA. Pay him a bonus if he finds a bug, and trust me - he will.

    Advanced (but a must):
    Write a test that mimics a normal user and exercises the most critical parts/APIs of the site.
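
    A rough sketch of what that can look like, assuming a Node 18+/Jest setup; the base URL and endpoints here are placeholders for your own critical paths:

    ```typescript
    // critical-path.test.ts - exercise what a normal user actually touches.
    // BASE_URL and the /signup route are placeholders; point them at your own app or staging.
    const BASE_URL = process.env.BASE_URL ?? "https://staging.example.com";

    test("landing page responds", async () => {
      const res = await fetch(BASE_URL);
      expect(res.status).toBe(200);
    });

    test("signup endpoint accepts a new user", async () => {
      const res = await fetch(`${BASE_URL}/signup`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ email: `smoke+${Date.now()}@example.com` }),
      });
      expect(res.ok).toBe(true);
    });
    ```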

  9. 2

    Absolutely, 100%. It's 2022, not 2002. I believe tests make you faster, not slower. I also write all kinds of tests, from unit to system, depending on what I think I need to cover. However, I never obsess about code coverage tools and numbers.

  10. 2

    First test, then code.

  11. 2

    On some of the numerous projects I never published, I went full TDD, used fancy new stuff, and really wanted to make everything great. However, I've promised myself to stay as MVP as possible in the future in order to validate. Boring tech, ugly code, no tests :)

    1. 1

      Sure! I think you can move quickly with some tests though, especially for core parts and important logic. But full TDD can be kind of extreme, especially when testing frontends and things like views. Things become very brittle and it really can slow you down.

      1. 1

        True. And sometimes it is also just an amazing exercise to refactor some parts, add tests and so on. Especially if you want to move more quickly in the next projects.

  12. 1

    I write shockingly few tests—indie or day job. I don't claim to write the optimal amount/proportion of tests, but I find that most bugs come from not understanding the problem, the workflow, and/or the edge cases, and if you don't understand those, your tests won't be any better than your application code.

    The tests I like most are almost at the extreme ends:

    • Cypress browser tests that cover an entire user workflow
    • fast, terse tests to cover as many cases as practical of libraries meant to support flexible usage
  13. 1

    For indie projects, few tests. Mainly I would test an algorithmic thing; rarely would I test a DB/UI thing. Bang for buck on the algorithmic thing is way higher (more likely to go wrong and way easier to test).

  14. 1

    I don't know much about this, but I am trying it out on this website: https://thenewsaddicts.com/

  15. 1

    In my experience, my meticulousness in writing tests depends on where my project is.

    I tend to look at projects on a scale from "no one would use this even if it were free" to "I am the power company" (i.e. I control a thing that no one can live without - I've never actually made a project like this haha).

    Before I've validated that people would use it at all (usually pre-beta test), I tend not to write very many, if any tests. Code that no one ever uses is guaranteed not to have bugs :)

    But usually if I'm building a project for myself or if I have some beta users who seem to like it, I'll test more complicated logic or utility functions. For instance, one of my projects involved a web crawler that needed to normalize links for pages that may or may not have canonical links in their HTML - I wrote unit tests for that.

    More extensive end to end and integration tests require a lot of upfront work, and in my opinion, it isn't worth guaranteeing you haven't broken anything until you've guaranteed that people will actually pay you for your project. Once people are paying you, I believe you have a definite obligation not to break their stuff.

    1. 1

      I do pretty much this. I don't write tests at first. Now that some people are using my SaaS, I write unit tests for core features.

  16. 1

    Yes, for most of my applications, except small prototypes/MVPs where I need to quickly validate something.

  17. 1

    I'm writing top-level feature and integration tests

  18. 1

    I write some E2E and not much else, especially pre-launch.

    I do love TS, though - I've been writing it since 2016 and it doesn't particularly slow me down. That said, I wouldn't recommend it to beginners who want to launch something quickly, at least not with strict configurations.

  20. 1

    I think a lot of the aversion to writing tests is that devs haven't put together their test stack, which means that testing and setting up the test infrastructure becomes more of a burden.

    This is totally fair, btw - pushing features is your lifeblood - but I'd be interested to see whether the barriers to setting up testing (the time cost) could be addressed by having a more off-the-shelf testing setup.

    Spend some time researching the test stack for your language and framework to make testing less intimidating :).

    Even on MVPs I'll still do testing (obviously I'll cut corners) because, following the Pareto principle, a modest amount of testing puts you in a great spot!

    My test stack:
    I have a much less mature TypeScript test stack, but here is my RoR stack off the top of my head:

    Rspec
    Capybara
    Factory Bot
    Faker
    VCR - makes mocking requests a breeze...this was a life changer for me
    DatabaseCleaner
    Selenium webdriver

  21. 1

    It's important to recognise that the more tests you write, the fewer users you will get, and vice versa ;)

  22. 1

    I do write feature tests.

  23. 1

    As someone who's in the validation stage (but also a developer and just barely resisting the urge to start building), I've been thinking about this a lot lately. Because this is a nights/weekends project for me, I keep thinking I should forego tests until I'm at least in beta. On the other hand, I'm not sure I'll be able to actually sleep at night during a beta if I don't have at least some critical paths tested ahead of time.

  24. 1

    Recently I've started diving into Cypress (https://www.cypress.io/) and have found it surprisingly easy to properly test with.

    For me, you test to gain more confidence in your code and in shipping. Early stage with no users, it's something you can easily neglect. But as more people start using the startup, you want to test to avoid those cold sweats when a user flags an issue in your app.

    I generally focus on having at least some integration tests for the mainline functionality plus anything that's close to the conversion funnel for my startups.
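
    Here's the shape of such a test - a sketch with a made-up signup flow, so the routes, copy, and selectors are placeholders:

    ```typescript
    // cypress/e2e/signup.cy.ts - mainline flow that sits close to the conversion funnel.
    describe("signup funnel", () => {
      it("lets a visitor create an account from the landing page", () => {
        cy.visit("/");
        cy.contains("Get started").click();
        cy.get("input[name=email]").type("new-user@example.com");
        cy.get("input[name=password]").type("correct horse battery staple");
        cy.contains("button", "Create account").click();
        cy.url().should("include", "/dashboard");
      });
    });
    ```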

  25. 1

    General answer to general question: yes. Test your shit before customers see it.

    More nuanced answer: it depends. My thought process is usually around two pillars:

    ---

    Development speed: will writing tests help me go faster? The answer is often yes, if you structure things a certain way. There are cases where it only slows you down, and it takes experience to understand the difference.

    Risk management: what are the likely outcomes if I ship buggy software? Specifically, what's the likelihood of something going wrong, and what is the impact? This really comes down to your market: how sensitive are they to failure; and your software delivery: how quickly can you release a fix?

    ---

    Personally, I test my primary business domain code the most for correctness/documentation (closer to BDD) and development feedback speed, and end up with integration tests on every endpoint I care about.

    Huge +1 to leaning into your type system (note: this is a skill you must practice) and removing a large class of tests that make your code fragile.

    I left a more detailed reply on Reddit re: QA the other day as well, with more general thoughts on the topic.

    Even though these are start-ups / solo developers, the thought process should be the same. The only thing that changes is the risk-management trade-offs. Manage them risks!

  26. 1

    I wrote a test function that I can turn on and off. I did this because my app has a ton of functionality that was hard to test manually, but easy to test with a simple function.

    When I make sweeping changes, I turn on the test function, then turn it off after the changes are pushed live.
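
    Roughly like this (a TypeScript sketch; the flag and the checked endpoints are made up - swap in whatever is painful to verify by hand):

    ```typescript
    // selfTest.ts - a sanity-check routine I can switch on around risky releases.
    // Flip SELF_TEST_ENABLED to true before sweeping changes, then back off once they're live.
    const SELF_TEST_ENABLED = false;

    // Placeholder checks: each entry is a name plus an endpoint that should respond OK.
    const CHECKS: Array<[name: string, url: string]> = [
      ["health endpoint", "/api/health"],
      ["report generation", "/api/reports/preview"],
    ];

    export async function runSelfTest(): Promise<void> {
      if (!SELF_TEST_ENABLED) return;

      const failures: string[] = [];
      for (const [name, url] of CHECKS) {
        const res = await fetch(url).catch(() => null);
        if (!res || !res.ok) failures.push(name);
      }

      console.log(
        failures.length ? `Self-test FAILED: ${failures.join(", ")}` : "Self-test passed"
      );
    }
    ```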

  27. 1

    TypeScript is great. Type annotations mean that you don't need to write tests just to ensure you haven't misspelled a parameter or misremembered how to dig deep into some structure you're using.
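
    A trivial example of the kind of check you'd otherwise need a test for (the names are made up):

    ```typescript
    // The compiler does the checking that a "did I spell this right?" unit test would do.
    interface User {
      id: string;
      displayName: string;
    }

    function greet(user: User): string {
      return `Hello, ${user.displayName}`;
    }

    greet({ id: "42", displayName: "Ada" });     // fine
    // greet({ id: "42", displayname: "Ada" });  // compile error: 'displayname' does not exist on User
    ```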

    The "my code is 50% type annotations" is...FUD. Plain and simple. React does make you write a bit more by way of type annotations than, say, Node.js backend code, but as soon as you get into anything complicated the type annotations will save your bacon many times over, more than paying you back for type-heavy tiny components like the above.

    As to testing: I'm simply very selective about what I test. I wouldn't likely have a test for a component like the above, for instance. It doesn't even really make sense to test it in isolation; what are you testing? That the img tag works? I might have tests on larger components or on full pages, but not on every tiny little function.

    I've been programming for a very long time. Coming up on 40 years, I think? I have a very good sense of both where bugs are most likely to live and where they're most likely to be a challenge to find. In those places I put a lot of test coverage, as well as runtime asserts and logging if the code encounters unexpected states.

    When I've been involved in projects with extensive tests, 99% of the tests seem to simply never fire. Yes you get lots of little green check-marks, which is satisfying. But each one of those checkmarks, in addition to representing a test, also represents technical debt. Any part of your code that's thoroughly tested at a low level is that much more resistant to being modified. If you only have a small number of tests to verify high level functionality, then changing how your app works is actually easier.

    Refactoring is easy with static types, too. About 90% of the time I can perform major refactoring, moving a lot of code around and improving it as I go, and at the end it just works the first time after it compiles. The other 10% of the time it takes a minor tweak or two to work.

    Using extensive static types in TypeScript or otherwise is a super-power. Dynamic types are a lie: They help you get started at the very beginning and then as the project progresses they add more and more technical debt. You get behind on documenting code (the types in a static-type language represent a minimal required level of known-good documentation); you forget how things are supposed to work because it's not all documented in one place and so structures grow in awkward ways; code that you thought was obvious at the time now takes you an hour to figure out before you can add that feature, where if the types were specified everywhere you could just have made the change in 10 minutes; the list goes on and on.

  28. 1

    I have a bad habit of not writing tests until I start to get annoyed at myself for regressions.

    And then by that time I've realized I built something no one wanted, so I'm glad I didn't write tests...

  29. 1

    It really depends on what the product is. If the product is straightforward enough, writing tests for the very first release might be a lot of wasted time on your part, especially if you have zero users and expect to change everything in the near future.

    Once you:

    1. Know what you want to go forward with
    2. Have people using your product
    3. Expect that the product will eventually become complicated

    ...then you might want to look into writing a good test suite to make future updates safer. Otherwise I'd say it's more optimal to spend this time developing the product itself. I'm doing exactly that with my current product; I shipped an initial release with zero testing, and now that I'm starting to have users, I'm considering taking a step back and safeguarding it before continuing.

  30. 1

    TypeScript is great, but type safety does not necessarily guarantee runtime safety.
    I tend to write tests regularly; they are another tool that gives me confidence to refactor stuff.

    On top of that, tests are the perfect tool for me to debug things.
    A test is the perfect setup for debugging.
    You have a fixed set of input and output values, paired with an easy to execute, reproducible setup.
    Makes stepping through code really easy!

    Last but not least, every time I encounter an issue in production, caused by an edge case or a regression, I write a test for it to confirm it's still fixed in future releases!
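
    For example (a sketch, assuming Jest and a made-up `splitCsvLine` helper that once misbehaved in production):

    ```typescript
    // csv.regression.test.ts - pin a production bug so it stays fixed.
    // splitCsvLine is hypothetical; imagine it used to drop empty trailing fields.
    import { splitCsvLine } from "./csv";

    test("keeps empty trailing fields (regression for a real-world edge case)", () => {
      expect(splitCsvLine("a,b,,")).toEqual(["a", "b", "", ""]);
    });
    ```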

    EDIT: I forgot one really awesome thing. A meaningful set of tests is pure gold when searching for a broken commit using git bisect

  31. 1

    As a general rule of thumb, I only write tests for things that can negatively impact users. This means that having users who actively use the product is a prerequisite to spending any time on test automation.

    To decide what features to test and how, let's take my product, Makery, as an example.

    1. Identify critical features
      Makery is a static site builder, so updating websites and publishing the changes can be considered critical. I host Makery's landing page on Makery and I make changes quite frequently. This means I basically manually test the site update process every time I publish something new.

    2. Pick features that you don't cover usually by yourself
      I only have one website, which means I create new websites less frequently. Website creation actually involves provisioning new resources in third-party services, saving new records, creating build artifacts, and publishing them online. I would consider this use case a good example to write a test for.

    3. Optimize for customer value
      The best tool to pick usually depends on everyone's experience and background, so there isn't any good or bad choice here. One important thing to consider is that you want to optimize for customer value and speed, not for technical challenges.

    So if I get back to my example, my goal is to make sure that creating a new website works. So it is completely fine to add let's say a Cypress test that clicks the Create website button before every release, even if once in a while I have to manually delete a couple of resources.

  32. 1

    As the tweet says, TypeScript is awesome! I use types extensively to pin down invariants in my code and let the type system do the heavy lifting. Types replace all the 'defensive' unit tests one would otherwise have to write in JavaScript to feel comfortable.
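
    One pattern I lean on (a sketch; the `ValidatedEmail` brand is just an illustrative example) is encoding an invariant in a type so unchecked values can't reach the code that depends on them:

    ```typescript
    // A "branded" type: the only way to get a ValidatedEmail is through validateEmail,
    // so downstream code can't be handed an unchecked string by accident.
    type ValidatedEmail = string & { readonly __brand: "ValidatedEmail" };

    function validateEmail(raw: string): ValidatedEmail | null {
      return /^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(raw) ? (raw as ValidatedEmail) : null;
    }

    function sendWelcome(to: ValidatedEmail): void {
      console.log(`Sending welcome mail to ${to}`);
    }

    const email = validateEmail("ada@example.com");
    if (email) sendWelcome(email);
    // sendWelcome("ada@example.com"); // compile error: a plain string is not a ValidatedEmail
    ```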

    As far as what I do actually test for my pre-MVP project, I write tests for the little utilities and types to make sure they work (these are small unit tests). Otherwise I'm doing manual testing and any gaps I find there I look to incorporate into the type system.

    I'm writing code that can be tested up front. I definitely have integration, E2E, and property-based testing on my radar as I write any code. I will want that eventually.

  33. 1

    I write backend tests (I'm not familiar with frontend testing). I try to keep coverage above 85%, and ideally over 95% if the product gets more validation. I do this for MVPs as well because it's easier to spot errors in my code - and if I come back to the codebase after a while, tests also help me understand the default behavior and edge cases more easily.
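
    If your test runner supports coverage thresholds, you can have CI enforce that floor. A sketch with Jest (the numbers mirror my targets):

    ```typescript
    // jest.config.ts - fail the test run if overall coverage drops below the target.
    import type { Config } from "jest";

    const config: Config = {
      collectCoverage: true,
      coverageThreshold: {
        global: { lines: 85, branches: 85, functions: 85, statements: 85 },
      },
    };

    export default config;
    ```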

  34. 1

    Yes, and I have a CI/CD pipeline set up as well. I want to push my code and focus on other things, knowing that the testing of critical features is automated.
