Using M1 Mac Minis to power our GitHub Actions iOS CI

TL;DR:

Using the arch command, we were able to run XCUITests on Apple Silicon machines controlled by GitHub Actions self-hosted runners.

  • arch -x86_64 boots GitHub’s self-hosted runner software under Rosetta.
  • arch -arm64 escapes emulation inheritance; a failed arch -arm64e probe also tells you that you’re not on an M1 Mac.
  • We found both arm64 and arm64e architectures of xcodebuild on our machines.

Full story

Screenplay’s continuous integration (CI) tests take about 120 minutes of Mac compute to test our software against 10+ major open-source apps. I wanted to add capacity to the CI worker cluster, so I ambitiously purchased two ARM Mac Minis.

I used Migration Assistant to duplicate an existing Intel Mac Mini in about 30 minutes. Smooth sailing so far.

The first hitch came when trying to boot the GitHub Actions self-hosted runner: the client that GitHub distributes currently ships only an x86_64 build.

No problem — we’ll use Rosetta to boot the runner through emulation:
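The invocation looked roughly like this (a sketch; run.sh is the launch script that ships with GitHub’s self-hosted runner package, and the install directory is whatever you chose during setup):

```sh
cd ~/actions-runner     # wherever the runner was configured
arch -x86_64 ./run.sh   # force the x86_64 slice to run under Rosetta
```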

The CI worker process came to life, and the M1 machine began accepting jobs. But when it attempted to build and test our battery of open-source apps, every one of them failed.

Running xcodebuild test ... creates two products: an iOS application and a UITest-Runner application. Using the file command, I could see that the UITest-Runner product contained only an x86_64 slice. While Rosetta ensures that all child processes, including xcodebuild and the simulator, inherit the Intel emulation, the iOS simulator on Apple Silicon machines appears to support only ARM binaries.
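The check looked something like this (paths abbreviated and output reconstructed from memory; details may differ on your machine):

```
$ file Build/Products/Debug-iphonesimulator/MyAppUITests-Runner.app/MyAppUITests-Runner
MyAppUITests-Runner: Mach-O 64-bit executable x86_64
```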

I tried a quick hack and adjusted a build setting in the test target of one of the apps:
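The exact setting isn’t preserved here, but the hack was along these lines: forcing the UITest target to emit an ARM slice even though xcodebuild itself was running under Rosetta (setting names below are our guess, in xcconfig form):

```
// Illustrative only — applied to the UITest target:
ARCHS = arm64
ONLY_ACTIVE_ARCH = NO
```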

While this build setting enabled the test runner to boot in the simulator, the test execution itself began to time out. A quick check revealed that the apps and their dependent frameworks were still all x86. I knew they could be built for ARM (release configurations require it), but I would need to find a way for Xcode to make the right choice.

Because almost all Xcode projects set the Architectures build setting (ARCHS) to $(ARCHS_STANDARD), the final simulator app binary would contain ARM or x86 code depending on which architecture Xcode itself was running under.

If I could escape the x86 emulation inherited from the GitHub runner parent process, xcodebuild would run as ARM, producing an ARM artifact and a successful simulator run.

How does one escape x86 emulation? Running man on the arch command suggested that Rosetta accepts only a limited set of architectures, none of them ARM.

Luckily for us, that documentation appears to be incomplete: arch -arm64 and arch -arm64e are both valid, undocumented options.
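On an M1 machine this is easy to confirm, even from inside an emulated shell (transcript illustrative):

```
$ uname -m            # inside the Rosetta-emulated runner process
x86_64
$ arch -arm64 uname -m
arm64
```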

Our CI runs on both Intel and ARM Mac Minis, so I needed to detect the environment correctly. Prepending arch -arm64e on an Intel machine fails because Intel Macs are unable to emulate ARM. Leveraging this failure, we run arch -arm64e echo hello and check the exit code. That, combined with the output of uname -m, was enough to pick the right invocation on every machine.
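A sketch of that detection logic in shell (run_native is our name, not Screenplay’s; on Intel Macs — and on non-Mac hosts — the arch -arm64e probe fails, so the command simply runs unmodified):

```shell
#!/bin/sh
# Run a command natively on Apple Silicon, even if this shell
# inherited x86_64 emulation from its parent (e.g. the CI runner).
run_native() {
  if arch -arm64e /usr/bin/true 2>/dev/null; then
    # The probe succeeded: we are on Apple Silicon. Break out of
    # any inherited Rosetta emulation for the child process.
    arch -arm64 "$@"
  else
    # Intel Mac (or arch lacks ARM support): run as-is.
    "$@"
  fi
}

run_native uname -m
```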

ARMed with this function, we were almost out of the woods: running Screenplay builds via arch -arm64 xcodebuild ... now worked reliably.

The last strange hurdle was a difference in behavior. When Screenplay’s software was executed from within an existing Xcode project as a run-script build phase, it passed. When executed from outside an Xcode project, it failed. Why?

Upon closer inspection, our M1 Mac Minis had two xcodebuild binaries with different architectures:
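The two binaries, roughly as we found them (output reconstructed from memory; the exact file output, and whether the shim is a universal binary, may differ):

```
$ file /usr/bin/xcodebuild
/usr/bin/xcodebuild: Mach-O 64-bit executable arm64e

$ file /Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild
/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild: Mach-O 64-bit executable arm64
```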

The xcodebuild triggered from the command line was arm64e, requiring arch -arm64e xcodebuild ...; the one invoked from within Xcode was arm64, requiring arch -arm64 xcodebuild .... Subtle and painful.

Knowing this difference, the final fix was simple: always reference the full xcodebuild path and pair it with the matching arch command.
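In script form, the fix looks roughly like this (scheme and destination are placeholders):

```sh
XCODEBUILD=/Applications/Xcode.app/Contents/Developer/usr/bin/xcodebuild
arch -arm64 "$XCODEBUILD" test -scheme MyApp \
  -destination 'platform=iOS Simulator,name=iPhone 12'
```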

And with that, our CI passed on both our Intel and M1 machines. Moving forward, we can stack on faster, cheaper Mac compute without worrying about compatibility, and we can keep our Intel machines running as a nice assurance of cross-compatibility.

(For what it’s worth, we consistently see about a 2x speedup when building and testing large open-source apps on the Apple Silicon Minis compared to the Intel machines.)

P.S.

There are still a handful of open-source apps we are forced to pin to our Intel machines. Apps like Wikipedia have a hard-coded Carthage workaround that excludes ARM architectures from their binaries: https://github.com/wikimedia/wikipedia-ios/blob/main/scripts/carthage_update#L28

I expect this to be a temporary workaround that will be removed in a few months.

If you want to learn more about the novel iOS engineering we’re doing at Screenplay, check out https://screenplay.dev

Interested in helping us build the first iOS rollback tool? https://jobs.lever.co/screenplay

Screenplay Cofounder, Former Airbnb Software Engineer, Harvard CS ‘17