Testing and Benchmarking in Java


Unit testing is an important part of writing principled Java programs. The JUnit 5 (Jupiter) framework provides the annotations and assertions we need to write unit tests, and build tools like Maven or Gradle run them for us.

For the sake of demonstration, this code is in a single file, but in a real project the class under test and its tests would live in separate files. Test code conventionally sits in its own directory tree, usually src/test/java, mirroring the packages of the production code under src/main/java.
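In a typical Maven or Gradle layout, the two halves might sit side by side like this (the com.example package and the separate IntUtils class are only for illustration):

src/main/java/com/example/IntUtils.java       // production code under test
src/test/java/com/example/IntUtilsTest.java   // the tests for IntUtils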

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.DisplayName;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;
import static org.junit.jupiter.api.Assertions.*;

public class IntUtilsTest {

    // We'll be testing this simple implementation of an integer minimum.
    public static int intMin(int a, int b) {
        return a < b ? a : b;
    }

    // A test is created by writing a method with the @Test annotation.
    @Test
    @DisplayName("Test basic intMin functionality")
    public void testIntMinBasic() {
        int ans = intMin(2, -2);
        // assertEquals throws an AssertionError and stops the test as
        // soon as the values differ. To run several related checks and
        // report all of their failures together, group them with
        // assertAll.
        assertEquals(-2, ans, "intMin(2, -2) should be -2");
    }

    // Writing tests can be repetitive, so it's idiomatic to
    // use a parameterized test, where test inputs and
    // expected outputs are provided as parameters.
    @ParameterizedTest(name = "intMin({0}, {1}) should return {2}")
    @CsvSource({
        "0, 1, 0",
        "1, 0, 0",
        "2, -2, -2",
        "0, -1, -1",
        "-1, 0, -1"
    })
    public void testIntMinParameterized(int a, int b, int expected) {
        int result = intMin(a, b);
        assertEquals(expected, result, 
            () -> String.format("intMin(%d, %d) should be %d", a, b, expected));
    }
}

For performance testing in Java, we typically use the Java Microbenchmark Harness (JMH). Here’s a simple example of how you might benchmark the intMin method:

import org.openjdk.jmh.annotations.*;
import java.util.concurrent.TimeUnit;

@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
@State(Scope.Thread)
public class IntMinBenchmark {

    // Return the result so the JIT compiler cannot treat the call as
    // dead code and eliminate the work we are trying to measure.
    @Benchmark
    public int benchmarkIntMin() {
        return IntUtilsTest.intMin(1, 2);
    }

    // To run this benchmark, you would typically set up a separate
    // main method (as sketched below) or use a build tool plugin
    // that supports JMH.
}
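
One way to write that main method is to drive JMH's Runner API directly. This is a minimal sketch, assuming the JMH core and annotation-processor dependencies are already on the classpath; the class name BenchmarkRunner is arbitrary:

import org.openjdk.jmh.runner.Runner;
import org.openjdk.jmh.runner.options.Options;
import org.openjdk.jmh.runner.options.OptionsBuilder;

public class BenchmarkRunner {

    public static void main(String[] args) throws Exception {
        // Select the benchmark class by its simple name and use a
        // single fork to keep this demonstration run short.
        Options options = new OptionsBuilder()
            .include(IntMinBenchmark.class.getSimpleName())
            .forks(1)
            .build();

        new Runner(options).run();
    }
}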

To run the tests, you would typically use a build tool like Maven or Gradle. With Maven, you might run:

$ mvn test
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running IntUtilsTest
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.086 s - in IntUtilsTest
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
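
A Gradle project runs the same tests with:

$ gradle test

and Maven's Surefire plugin can select a single test class, for example:

$ mvn -Dtest=IntUtilsTest test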

To run benchmarks with JMH, you can use a small main method like the one sketched above, or have a build tool plugin package an executable benchmarks JAR. The output might look something like this:

$ java -jar benchmarks.jar
# JMH version: 1.19
# VM version: JDK 1.8.0_151, VM 25.151-b12
# VM invoker: /Library/Java/JavaVirtualMachines/jdk1.8.0_151.jdk/Contents/Home/jre/bin/java
# VM options: <none>
# Warmup: 5 iterations, 10 s each
# Measurement: 5 iterations, 10 s each
# Timeout: 10 min per iteration
# Threads: 1 thread, will synchronize iterations
# Benchmark mode: Average time, time/op
# Benchmark: IntMinBenchmark.benchmarkIntMin

# Run progress: 0.00% complete, ETA 00:00:50
# Fork: 1 of 1
# Warmup Iteration   1: 2.557 ns/op
# Warmup Iteration   2: 2.556 ns/op
# Warmup Iteration   3: 2.556 ns/op
# Warmup Iteration   4: 2.556 ns/op
# Warmup Iteration   5: 2.556 ns/op
Iteration   1: 2.556 ns/op
Iteration   2: 2.556 ns/op
Iteration   3: 2.556 ns/op
Iteration   4: 2.556 ns/op
Iteration   5: 2.556 ns/op

Result "IntMinBenchmark.benchmarkIntMin":
  2.556 ±(99.9%) 0.001 ns/op [Average]
  (min, avg, max) = (2.556, 2.556, 2.556), stdev = 0.001
  CI (99.9%): [2.555, 2.557] (assumes normal distribution)

This output shows the average time per call to intMin, measured in nanoseconds per operation. A method this small completes in a few nanoseconds, so the absolute figure largely reflects call overhead and should be read with that in mind.