Getting detailed Unity unit test reports with Codemagic!

Feb 16, 2022

This article is written by Mina Pêcheux.

This post has been updated in July 2022 to introduce some fixes to the codemagic.yaml file in the sample project, and to reflect that you no longer need to contact Codemagic solution engineers to set up Unity machines for you: You can start using Unity on Codemagic right away, as it is preinstalled on Mac, Linux, and Windows build machines.

TL;DR: Unity does give you test reports, but not in a format that CI/CD tools can read. We write a C# script that transforms the NUnit XML format into the JUnit XML format that Codemagic can read and use to present you with a report on how your tests went.

A few weeks ago, I wrote an article about unit testing Unity projects in which I discussed why testing is so valuable for game devs and showed how to re-create a basic Pong game with some unit tests. We also took a look at how to automate the testing and deployment of our Unity project in a CI/CD pipeline with Codemagic. :)

If you want to take a look at the code of the sample project, it’s available on GitHub! 🚀

Unit tests are a nice way of ensuring you deliver and deploy valid code and you’re not shipping errors… but what about times when you actually do have some errors? How can you know what needs fixing? Having an overall “passed” or “failed” state isn’t enough for in-depth debugging! You need to know which exact test has failed so that you know where to look.

That’s why when you work with unit tests, you usually put in place what is called test reporting to continuously get the detailed list of test suites and test cases. This allows you to quickly check where the errors come from and why your pipeline failed. And this is what we’re going to discuss in this article.

So, today, let’s see:

  • how to do test reporting during development directly in Unity
  • how to show and download your test results from your Codemagic workflow for checks during the deployment phase!

Learn about other reasons why you should use CI/CD for your Unity projects in this article.

Unity test reporting in the Unity editor

In the previous article, I talked about the official Unity Test Framework package and shared a few code snippets that showcased how to use it for our Pong game. But the nice thing about this tool is that it also comes with built-in reporting inside the Unity editor. :)

Remember the Test Runner window that appeared after we installed the Test Framework package?

Once you’ve created your test suite with an Assembly Reference asset and populated it with some test cases (or even if you’ve just created a default test suite with the tool’s auto-generator), your Unity editor will recompile the assembly and instantly present you with the list of test suites and cases in this window:

Here, you see the usual hierarchy of unit tests:

  • in red: the test suite, the root level at which all of your tests live
  • in orange (optional): the test class, an intermediary level to create logical units in your codebase and better organize the code
  • in yellow: the test cases, the actual “testing functions” that contain your asserts

Note: This hierarchical structure is useful because, most of the time, you’ll get “grouped” statistics at the medium level and therefore have an easy-to-catch bird’s-eye view of your various unit “blocks.” In the case of our Pong game, we’d quickly get an overview of what works and what doesn’t for the Paddle-related functions, Ball-related functions, and Score-related functions:
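In code, this hierarchy simply maps to test classes and [Test] methods. For instance, a Pong fixture might look something like the following (a simplified sketch; the actual asserts live in the sample project):

using NUnit.Framework;

// one test class ("fixture") per logical unit of the game...
public class PaddleTests
{
    // ...and one [Test] method per test case
    [Test]
    public void ShouldCreatePaddles()
    {
        // actual setup and asserts omitted here (see the sample repo)
        Assert.Pass();
    }
}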

At that point, you can use the “Run all” or “Run selected” buttons at the top to directly have the tool go through all your test cases… and give you a live report of the results!

The Unity Test Framework will show you the usual test reporting info: the total number of tests, how many passed or failed and which ones, etc. You can even select your tests to see at which line they crashed and which assert failed!

The tool also tells you how long each test took to run and the total duration of the suite and the run.

Unity test reporting in CI/CD

Checking that your tests work inside the Unity editor is sweet, but ultimately, you’ll probably be building and running your project in quite a different environment: Your Unity exports can be for another OS, a mobile platform, WebGL, and so on.

So even though everything was OK in the dev environment, you should also make sure that you properly monitor the production code and that you have enough info on the deployed version.

Just like last time, let’s take advantage of Codemagic to deploy our Unity app from our Git repository, and let’s learn how to easily add test reporting to our Unity apps on Codemagic…

Preparing the test reporter: Converting NUnit to JUnit

If you take a look at the Unity Test Framework’s docs, you’ll see that when you run the tool from the command line, you can pass an additional option to the executable: -testResults. This parameter allows you to specify the path for the test report as an XML file.
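For example, assuming the Unity executable is on your PATH, a typical invocation could look something like this (a sketch based on the Test Framework docs; adapt the test platform and paths to your own setup):

# run the EditMode tests headlessly and write the (NUnit-formatted) report to results.xml
Unity -batchmode -nographics -runTests -testPlatform EditMode -projectPath . -testResults results.xml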

For example, for my basic Pong game, Unity will provide me with an XML file that looks like this:

<?xml version="1.0" encoding="utf-8"?>
<test-suite type="TestSuite" id="1019" name="UnitTestingPong" fullname="UnitTestingPong" runstate="Runnable" testcasecount="10" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.117678" total="10" passed="10" failed="0" inconclusive="0" skipped="0" asserts="0">
  <properties />
  <test-suite type="Assembly" id="1033" name="EditorTests.dll" fullname="/Users/mina/Documents/ADMIN/WORK/Codemagic/UnitTestingPong/Library/ScriptAssemblies/EditorTests.dll" runstate="Runnable" testcasecount="10" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.104961" total="10" passed="10" failed="0" inconclusive="0" skipped="0" asserts="0">
    <properties>
      <property name="_PID" value="54731" />
      <property name="_APPDOMAIN" value="Unity Child Domain" />
      <property name="platform" value="EditMode" />
    </properties>
    <test-suite type="TestSuite" id="1034" name="EditorTests" fullname="EditorTests" runstate="Runnable" testcasecount="10" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.104324" total="10" passed="10" failed="0" inconclusive="0" skipped="0" asserts="0">
      <properties />
      <test-suite type="TestFixture" id="1020" name="BallTests" fullname="EditorTests.BallTests" classname="EditorTests.BallTests" runstate="Runnable" testcasecount="1" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.037970" total="1" passed="1" failed="0" inconclusive="0" skipped="0" asserts="0">
        <properties />
        <test-case id="1021" name="ShouldInitializeBall" fullname="EditorTests.BallTests.ShouldInitializeBall" methodname="ShouldInitializeBall" classname="EditorTests.BallTests" runstate="Runnable" seed="1302748225" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.022394" asserts="0">
          <properties />
        </test-case>
      </test-suite>
      <test-suite type="TestFixture" id="1022" name="PaddleTests" fullname="EditorTests.PaddleTests" classname="EditorTests.PaddleTests" runstate="Runnable" testcasecount="7" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.062591" total="7" passed="7" failed="0" inconclusive="0" skipped="0" asserts="0">
        <properties />
        <test-case id="1023" name="ShouldCreatePaddles" fullname="EditorTests.PaddleTests.ShouldCreatePaddles" methodname="ShouldCreatePaddles" classname="EditorTests.PaddleTests" runstate="Runnable" seed="699340261" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.009140" asserts="0">
          <properties />
        </test-case>
        <test-case id="1025" name="ShouldInitializePaddles" fullname="EditorTests.PaddleTests.ShouldInitializePaddles" methodname="ShouldInitializePaddles" classname="EditorTests.PaddleTests" runstate="Runnable" seed="1733908646" result="Passed" start-time="2022-01-24 20:32:38Z" end-time="2022-01-24 20:32:38Z" duration="0.000965" asserts="0">
          <properties />
        </test-case>
        <!-- ... -->
      </test-suite>
    </test-suite>
  </test-suite>
</test-suite>

The problem is that this XML file is not structured properly for common test report prettifiers, such as the one Codemagic uses! By default, the package outputs the results in the NUnit format, whereas most CI/CD tools (like Jenkins or Codemagic, for example) prefer another XML format, the JUnit XML reporting file format:

<?xml version="1.0" encoding="UTF-8"?>
<!-- a description of the JUnit XML format and how Jenkins parses it. See also junit.xsd -->

<!-- if only a single testsuite element is present, the testsuites
     element can be omitted. All attributes are optional.
     Not supported by maven surefire.
 -->
<testsuites disabled="" <!-- total number of disabled tests from all testsuites. -->
            errors=""   <!-- total number of tests with error result from all testsuites. -->
            failures="" <!-- total number of failed tests from all testsuites. -->
            name=""
            tests=""    <!-- total number of tests from all testsuites. Some software may expect to only see the number of successful tests from all testsuites though. -->
            time=""     <!-- time in seconds to execute all test suites. -->
	    >

  <!-- testsuite can appear multiple times, if contained in a testsuites element.
       It can also be the root element. -->
  <testsuite name=""      <!-- Full (class) name of the test for non-aggregated testsuite documents.
                               Class name without the package for aggregated testsuites documents. Required -->
	     tests=""     <!-- The total number of tests in the suite, required. -->
	     disabled=""  <!-- the total number of disabled tests in the suite. optional. not supported by maven surefire. -->
             errors=""    <!-- The total number of tests in the suite that errored. An errored test is one that had an unanticipated problem,
                               for example an unchecked throwable; or a problem with the implementation of the test. optional -->
             failures=""  <!-- The total number of tests in the suite that failed. A failure is a test which the code has explicitly failed
                               by using the mechanisms for that purpose. e.g., via an assertEquals. optional -->
             hostname=""  <!-- Host on which the tests were executed. 'localhost' should be used if the hostname cannot be determined. optional. not supported by maven surefire. -->
	     id=""        <!-- Starts at 0 for the first testsuite and is incremented by 1 for each following testsuite. optional. not supported by maven surefire. -->
	     package=""   <!-- Derived from testsuite/@name in the non-aggregated documents. optional. not supported by maven surefire. -->
	     skipped=""   <!-- The total number of skipped tests. optional -->
	     time=""      <!-- Time taken (in seconds) to execute the tests in the suite. optional -->
	     timestamp="" <!-- when the test was executed in ISO 8601 format (2014-01-21T16:17:18). Timezone may not be specified. optional. not supported by maven surefire. -->
	     >

    <!-- Properties (e.g., environment settings) set during test execution.
         The properties element can appear 0 or once. -->
    <properties>
      <!-- property can appear multiple times. The name and value attributes are required. -->
      <property name="" value=""/>
    </properties>

    <!-- testcase can appear multiple times, see /testsuites/testsuite@tests -->
    <testcase name=""       <!-- Name of the test method, required. -->
	      assertions="" <!-- number of assertions in the test case. optional. not supported by maven surefire. -->
	      classname=""  <!-- Full class name for the class the test method is in. required -->
	      status=""     <!-- optional. not supported by maven surefire. -->
	      time=""       <!-- Time taken (in seconds) to execute the test. optional -->
	      >

      <!-- If the test was not executed or failed, you can specify one of the skipped, error or failure elements. -->

      <!-- skipped can appear 0 or once. optional -->
      <skipped message=""   <!-- message/description string why the test case was skipped. optional -->
	  />

      <!-- error indicates that the test errored.
           An errored test had an unanticipated problem.
           For example an unchecked throwable (exception), crash or a problem with the implementation of the test.
           Contains as a text node relevant data for the error, for example a stack trace. optional -->
      <error message="" <!-- The error message. e.g., if a java exception is thrown, the return value of getMessage() -->
	     type=""    <!-- The type of error that occured. e.g., if a java exception is thrown the full class name of the exception. -->
	     >error description</error>

      <!-- failure indicates that the test failed.
           A failure is a condition which the code has explicitly failed by using the mechanisms for that purpose.
           For example via an assertEquals.
           Contains as a text node relevant data for the failure, e.g., a stack trace. optional -->
      <failure message="" <!-- The message specified in the assert. -->
	       type=""    <!-- The type of the assert. -->
	       >failure description</failure>

      <!-- Data that was written to standard out while the test was executed. optional -->
      <system-out>STDOUT text</system-out>

      <!-- Data that was written to standard error while the test was executed. optional -->
      <system-err>STDERR text</system-err>
    </testcase>

    <!-- Data that was written to standard out while the test suite was executed. optional -->
    <system-out>STDOUT text</system-out>
    <!-- Data that was written to standard error while the test suite was executed. optional -->
    <system-err>STDERR text</system-err>
  </testsuite>
</testsuites>

As you can see, this format uses different tags and attributes than the one the Test Framework creates. This means we can’t rely on the built-in feature to output our test results into a valid XML format.

After searching the net for a while, I didn't actually find an XML formatter that goes from the Unity Test Framework's ITestResultAdaptor result C# object to the JUnit XML format… so I decided I would code one myself. ;)

The idea was simply to take the JUnit specifications quoted above and code a little C# function to produce the proper XML document from the tree structure of my Unity Test Framework results. I just had to make sure to skip the intermediary global Unity assembly level, because the JUnit XML format doesn't allow for nested test suites; other than that, it was mainly about matching the attributes to the right fields in the ITestResultAdaptor object and its children:

using System.Linq;
using System.Xml;
using UnityEditor.TestTools.TestRunner.Api;

public static class Reporter
{

    public static void ReportJUnitXML(string path, ITestResultAdaptor result)
    {
        // create the XML document with the standard declaration
        XmlDocument xmlDoc = new XmlDocument();
        XmlDeclaration xmlDecl = xmlDoc.CreateXmlDeclaration("1.0", "UTF-8", null);
        xmlDoc.AppendChild(xmlDecl);

        XmlNode rootNode = xmlDoc.CreateElement("testsuites");
        XmlAttribute rootAttrTests = xmlDoc.CreateAttribute("tests");
        rootAttrTests.Value = result.PassCount.ToString();
        rootNode.Attributes.Append(rootAttrTests);
        XmlAttribute rootAttrSkipped = xmlDoc.CreateAttribute("disabled");
        rootAttrSkipped.Value = result.SkipCount.ToString();
        rootNode.Attributes.Append(rootAttrSkipped);
        XmlAttribute rootAttrFailed = xmlDoc.CreateAttribute("failures");
        rootAttrFailed.Value = result.FailCount.ToString();
        rootNode.Attributes.Append(rootAttrFailed);
        XmlAttribute rootAttrName = xmlDoc.CreateAttribute("name");
        rootAttrName.Value = result.Name;
        rootNode.Attributes.Append(rootAttrName);
        XmlAttribute rootAttrTime = xmlDoc.CreateAttribute("time");
        rootAttrTime.Value = result.Duration.ToString();
        rootNode.Attributes.Append(rootAttrTime);
        xmlDoc.AppendChild(rootNode);

        // skip the intermediary assembly level (JUnit XML doesn't allow nested
        // test suites) and iterate directly over the test suites it contains
        foreach (ITestResultAdaptor testSuite in result.Children.First().Children)
        {
            XmlNode testSuiteNode = xmlDoc.CreateElement("testsuite");
            XmlAttribute testSuiteAttrName = xmlDoc.CreateAttribute("name");
            testSuiteAttrName.Value = testSuite.Name;
            testSuiteNode.Attributes.Append(testSuiteAttrName);
            XmlAttribute testSuiteAttrTests = xmlDoc.CreateAttribute("tests");
            testSuiteAttrTests.Value = testSuite.Test.TestCaseCount.ToString();
            testSuiteNode.Attributes.Append(testSuiteAttrTests);
            XmlAttribute testSuiteAttrSkipped = xmlDoc.CreateAttribute("disabled");
            testSuiteAttrSkipped.Value = testSuite.SkipCount.ToString();
            testSuiteNode.Attributes.Append(testSuiteAttrSkipped);
            XmlAttribute testSuiteAttrFailed = xmlDoc.CreateAttribute("failures");
            testSuiteAttrFailed.Value = testSuite.FailCount.ToString();
            testSuiteNode.Attributes.Append(testSuiteAttrFailed);
            XmlAttribute testSuiteAttrTime = xmlDoc.CreateAttribute("time");
            testSuiteAttrTime.Value = testSuite.Duration.ToString();
            testSuiteNode.Attributes.Append(testSuiteAttrTime);
            // flatten the fixtures: their test cases are appended directly to the suite node
            foreach (ITestResultAdaptor testFixture in testSuite.Children)
            {
                foreach (ITestResultAdaptor testCase in testFixture.Children)
                {
                    XmlNode testCaseNode = xmlDoc.CreateElement("testcase");
                    XmlAttribute testAttrName = xmlDoc.CreateAttribute("name");
                    testAttrName.Value = testCase.Name;
                    testCaseNode.Attributes.Append(testAttrName);
                    XmlAttribute testAttrAssertions = xmlDoc.CreateAttribute("assertions");
                    testAttrAssertions.Value = testCase.AssertCount.ToString();
                    testCaseNode.Attributes.Append(testAttrAssertions);
                    XmlAttribute testAttrTime = xmlDoc.CreateAttribute("time");
                    testAttrTime.Value = testCase.Duration.ToString();
                    testCaseNode.Attributes.Append(testAttrTime);
                    XmlAttribute testAttrStatus = xmlDoc.CreateAttribute("status");
                    testAttrStatus.Value = testCase.TestStatus.ToString();
                    testCaseNode.Attributes.Append(testAttrStatus);
                    testSuiteNode.AppendChild(testCaseNode);
                }
            }
            rootNode.AppendChild(testSuiteNode);
        }

        xmlDoc.Save(path);
    }

}

Note: Keep in mind that this code works well for my simple project, but it is fairly basic and limited, so you might need to adapt and/or extend it if you work with more advanced unit tests.
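For reference, run on the Pong tests from before, this function would produce a report roughly like this (values are illustrative, taken from the NUnit output above):

<?xml version="1.0" encoding="UTF-8"?>
<testsuites tests="10" disabled="0" failures="0" name="UnitTestingPong" time="0.117678">
  <testsuite name="EditorTests" tests="10" disabled="0" failures="0" time="0.104324">
    <testcase name="ShouldInitializeBall" assertions="0" time="0.022394" status="Passed" />
    <testcase name="ShouldCreatePaddles" assertions="0" time="0.009140" status="Passed" />
    <!-- ... -->
  </testsuite>
</testsuites>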

So, I now have a very basic way of creating an XML document from my test results! I'll integrate this new function into my Runner script and call it when the test run is finished, if the user passed a specific argument to the executable, -testsOutput, to specify the path of the XML report:

using UnityEditor;
using UnityEditor.TestTools.TestRunner.Api;
using UnityEngine;

public static class Runner
{
    private static TestRunnerApi _runner = null;

    private class MyCallbacks : ICallbacks
    {

        public void RunStarted(ITestAdaptor testsToRun)
        {}

        public void RunFinished(ITestResultAdaptor result)
        {
            // export test results to JUnit XML format if
            // using -testsOutput argument
            string reportPath = null;
            string[] args = System.Environment.GetCommandLineArgs();
            for (int i = 0; i < args.Length; i++)
            {
                if (args[i] == "-testsOutput")
                {
                    reportPath = args[i + 1];
                    break;
                }
            }

            if (reportPath != null) {
                Reporter.ReportJUnitXML(reportPath, result);
            }

            // clean up and return the proper exit code
            _runner.UnregisterCallbacks(this);
            if (result.ResultState != "Passed")
            {
                Debug.Log("Tests failed :(");
                if (Application.isBatchMode)
                    EditorApplication.Exit(1);
            }
            else
            {
                Debug.Log("Tests passed :)");
                if (Application.isBatchMode)
                    EditorApplication.Exit(0);
            }
        }

        public void TestStarted(ITestAdaptor test)
        {}

        public void TestFinished(ITestResultAdaptor result)
        {}
    }

    public static void RunUnitTests()
    {
        _runner = ScriptableObject.CreateInstance<TestRunnerApi>();
        Filter filter = new Filter()
        {
            testMode = TestMode.EditMode
        };
        _runner.RegisterCallbacks(new MyCallbacks());
        _runner.Execute(new ExecutionSettings(filter));
    }
}
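At this point, you can already give it a spin locally: assuming the Unity binary is on your PATH, a call like the following should run the tests and write the JUnit XML report (using the -testsOutput flag we just defined):

# run our custom runner headlessly and export the JUnit XML report to tests.xml
Unity -batchmode -nographics -projectPath . -executeMethod Runner.RunUnitTests -testsOutput tests.xml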

My project is now ready for test reporting inside a Codemagic workflow — so let’s jump to the configuration of our app via its codemagic.yaml file. ;)

Configuring Codemagic for our Unity app

I won’t detail all the steps on how to create a Codemagic app for a Unity project in this post: Be sure to check out the previous article (the “Setting up Codemagic CI/CD” section) for more info on that. :)

Important note: At the time of writing this, deploying Unity apps with Codemagic requires a Pro or Plus Unity license. Previously, you would’ve also needed to contact Codemagic so that we could give you access to the special type of build machines with Unity, but this is no longer necessary, as Unity is preinstalled by default on Mac, Linux, and Windows build machines. However, you can still contact Codemagic to receive a personal demo.

If you’ve followed the steps from the first tutorial or already have a Unity app set up with Codemagic, then you’ll have a configuration file for your Codemagic workflow that looks something like this:

workflows:
  unity-mac-workflow:
      # Unity is preinstalled on Codemagic's Mac, Linux, and Windows build machines
      name: Unity Mac Workflow
      environment:
        groups:
          # Add the group environment variables in Codemagic UI (either in Application/Team variables) - https://docs.codemagic.io/variables/environment-variable-groups/
          - unity # <-- (Includes UNITY_SERIAL, UNITY_USERNAME, UNITY_PASSWORD)
        vars:
          UNITY_BIN: ${UNITY_HOME}/Contents/MacOS/Unity
      scripts:
        - name: Activate License
          script: $UNITY_BIN -batchmode -quit -logFile -serial ${UNITY_SERIAL?} -username ${UNITY_USERNAME?} -password ${UNITY_PASSWORD?}
        - name: Run Unit Tests
          script: $UNITY_BIN -batchmode -executeMethod Runner.RunUnitTests -logFile -nographics -projectPath .
        - name: Build
          script: $UNITY_BIN -batchmode -quit -logFile -projectPath . -executeMethod BuildScript.BuildMac -nographics
      artifacts:
        - "mac/UnitTestingPong.app"
      publishing:
        scripts:
          - name: Deactivate License
            script: $UNITY_BIN -batchmode -quit -returnlicense -nographics

This codemagic.yaml file should be placed at the root of your Git repository, and it will tell the Codemagic workflow runner which automation steps to take to validate, build, and deploy your project. It basically sets up the right working environment for the workflow (in particular, it imports or defines the necessary environment variables), then runs the unit tests (and stops immediately if they fail). Then it builds the project for the given target platform, and finally, it deploys the generated artifact.

Now, let’s add just a little bit of configuration to this pipeline so that our Runner script creates a JUnit-compatible XML report (thanks to our brand-new -testsOutput command-line argument) and that the Codemagic UI then displays it in a user-friendly way.

All we need to do is define a new environment variable with the path to our test report file and update the “Run Unit Tests” script block in the middle.

  1. First, we’ll define TEST_REPORT_PATH in the environment section and set it to “tests.xml” — that’s where the test results will be written to using our handmade JUnit XML reporter.
  2. Then, we’ll add the -testsOutput option and use this environment variable to actually produce the report with our Runner script call.
  3. Finally, we’ll use the test_report workflow parameter and pass it this TEST_REPORT_PATH reference so that our results can be pretty-printed on our Codemagic build page.
workflows:
  unity-mac-workflow:
      # Unity is preinstalled on Codemagic's Mac, Linux, and Windows build machines
      name: Unity Mac Workflow
      environment:
        groups:
          # Add the group environment variables in Codemagic UI (either in Application/Team variables) - https://docs.codemagic.io/variables/environment-variable-groups/
          - unity # <-- (Includes UNITY_SERIAL, UNITY_USERNAME, UNITY_PASSWORD)
        vars:
          UNITY_BIN: ${UNITY_HOME}/Contents/MacOS/Unity
          TEST_REPORT_PATH: tests.xml
      scripts:
        - ...
        - name: Run Unit Tests
          script: $UNITY_BIN -batchmode -executeMethod Runner.RunUnitTests -logFile -nographics -projectPath . -testsOutput ${TEST_REPORT_PATH}
          test_report: ${TEST_REPORT_PATH}
        - ...
      artifacts:
        - "mac/UnitTestingPong.app"
      publishing: ...

And that’s it! Let’s commit these changes and run a new build on Codemagic using this new codebase. After the workflow is finished, we can look at the “Run Unit Tests” step and switch to the “Results” tab to see our test report:

Now Codemagic is reporting on unit tests for our Unity app correctly: The test report is nice and clean, and it gives us a detailed description of the unit tests with some global stats (on the left) as well as each test case’s name and status.

As you can see, I didn’t bother re-creating nodes for the different suites and just grouped everything under the same block. This means that I don’t have any separation between the “Paddle” and “Ball” tests, for example — but we could totally rework the Reporter class to handle this and further improve the display. :)

Bonus: Adding the Unity unit test report as an artifact

To wrap this up, let’s make another small improvement to our Codemagic workflow and make our XML test report an artifact of the pipeline. This can be useful if you want to have a local copy of the data or if you need to share it with your teammates without actually redirecting them to the Codemagic app dashboard.

To make the XML report available as an artifact, we just need to add it to the artifacts block of our codemagic.yaml configuration file:

workflows:
  unity-mac-workflow:
      # Unity is preinstalled on Codemagic's Mac, Linux, and Windows build machines
      name: Unity Mac Workflow
      environment: ...
      scripts: ...
      artifacts:
        - "mac/UnitTestingPong.app"
        - ${TEST_REPORT_PATH}
      publishing: ...

Let’s push this in a commit and rerun a new build, then download the artifacts that are packaged as a separate ZIP archive at the end of the workflow:

If you uncompress this archive and take a look inside, you’ll see it contains our “tests.xml” file with all the test report info! :)

Conclusion

To be honest, I wish the Unity Test Framework had a built-in feature for exporting JUnit-XML-compatible test reports, since it's such a common format in CI/CD. But all in all, making this handmade JUnit XML reporter was really fun: I found it very interesting to go back to the basics and build it gradually based on the JUnit specs.

As I said, we could obviously improve it to handle more complex test suites and to make better reports of errors… but that’s a topic for another time! ;)

Don’t forget: You can get the sample “unit-tested” Pong project with the test reporting and the codemagic.yaml file on GitHub over here. 🚀


Mina Pêcheux is a freelance full-stack web & game developer. She’s also passionate about computer graphics, music, data science, and other topics! She runs her own blog. Alternatively, you can find her on Twitter.
