
Document guidelines on testing #123

Open
sanderploegsma opened this issue Feb 2, 2024 · 1 comment

Comments

@sanderploegsma
Contributor

I think we should come up with some guidelines on how analyzers should be tested. My proposal would be:

  • Unit tests should exist to achieve as much code coverage as possible; the correctness of each exercise analyzer should be verified that way.
  • Each exercise should have at least two smoke tests: one with an optimal solution that receives no feedback, and one with a solution that receives at least one exercise-specific comment.
  • In addition, there should be a few smoke tests covering exercises for which no analyzer is implemented, to make sure the analyzer handles every exercise gracefully: it should produce an empty result rather than crash.
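A minimal sketch of what the two proposed smoke tests per exercise could look like. The `TwoFerAnalyzer` class, its `analyze` method, and the comment identifier used here are all hypothetical stand-ins, not the actual analyzer API:

```java
import java.util.List;

// Hypothetical analyzer stub: returns exercise-specific comment identifiers,
// where an empty list means "approve with no feedback".
class TwoFerAnalyzer {
    List<String> analyze(String solution) {
        if (solution.contains("if (")) {
            // Illustrative rule: flag conditional logic where a single
            // template string would do.
            return List.of("java.two-fer.avoid_conditional_logic");
        }
        return List.of();
    }
}

public class SmokeTestSketch {
    public static void main(String[] args) {
        TwoFerAnalyzer analyzer = new TwoFerAnalyzer();

        // Smoke test 1: an optimal solution receives no feedback.
        String optimal =
            "String twoFer(String name) { return \"One for \" + name + \", one for me.\"; }";
        if (!analyzer.analyze(optimal).isEmpty()) {
            throw new AssertionError("optimal solution should receive no comments");
        }

        // Smoke test 2: a suboptimal solution receives at least one comment.
        String suboptimal =
            "String twoFer(String name) { if (name == null) { return \"One for you, one for me.\"; } return \"One for \" + name + \", one for me.\"; }";
        if (analyzer.analyze(suboptimal).isEmpty()) {
            throw new AssertionError("suboptimal solution should receive at least one comment");
        }

        System.out.println("smoke tests passed");
    }
}
```

The same pair of tests, plus a "no analyzer implemented, should not crash" case, could then be repeated for each exercise.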

Once we come up with some concrete guidelines, we should probably write them down in the docs.

Originally posted by @sanderploegsma in #122 (comment)

@sanderploegsma
Contributor Author

I would also like to propose using the ApprovalTests library for the unit tests, similar to how I set up the unit tests in exercism/java-representer#109.
