[cfarm-users] gcc104: rosetta and disk space concerns

Zach van Rijn me at zv.io
Mon Oct 24 15:51:10 CEST 2022


On Tue, 2022-10-18 at 10:00 -0500, Zach van Rijn via cfarm-users
wrote:
> ...
> 
> Rosetta is installed now.
> 
> gcc104 (homebrew):~ zv$ gcc -arch x86_64 foo.c -o foo
> gcc104 (homebrew):~ zv$ file foo
> foo: Mach-O 64-bit executable x86_64
> gcc104 (homebrew):~ zv$ ./foo
> hello!


When an x86_64 binary is executed, a translated image is created
under /System/Volumes/Data/private/var/db/oah/. This is the
ahead-of-time translation artifact cache [0], and it will fill
the disk unless someone (currently me, manually) deletes files
in it.
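
For anyone who wants to check, something along these lines shows
the size and file count (standard du/find against the path above;
the exact invocation is only illustrative):

  $ sudo du -sh /System/Volumes/Data/private/var/db/oah/
  $ sudo find /System/Volumes/Data/private/var/db/oah/ -type f | wc -l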

One user of this system runs automated tests at low system load,
which is normally a perfect use of the farm; however, these
unit/regression tests seem to create ~50GB of artifacts per day.

Right now, two days after I last cleared the cache, it holds 80k
files spread across 80k directories (one file per directory),
totaling 67GB.

If two people did this, the disk would fill up even faster.

The immediate solution is to delete all cached artifacts, since
the translation is performed again, transparently, if needed.
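
Concretely, on this machine (SIP disabled, see below) that comes
down to something like:

  $ sudo rm -rf /System/Volumes/Data/private/var/db/oah/*

Use with care: this drops every cached translation, so the next
run of each x86_64 binary pays the translation cost once more.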

I am looking for a more robust solution than this [1], to ensure
that the disk never fills up with these artifacts. A plain cron
job is not ideal, since the rate of disk usage is not constant
with respect to time. I could check every N hours whether X
storage is used and then delete the oldest files, but I am
hoping for a better way.
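
For concreteness, the check-then-prune stopgap I describe above
would look roughly like this (the size limit and age window are
arbitrary placeholders, and it assumes SIP remains disabled):

  #!/bin/sh
  # Prune the Rosetta AOT cache once it exceeds a size limit.
  CACHE=/System/Volumes/Data/private/var/db/oah
  LIMIT_KB=$((20 * 1024 * 1024))  # 20 GB; arbitrary

  used_kb=$(du -sk "$CACHE" | awk '{print $1}')
  [ "$used_kb" -le "$LIMIT_KB" ] && exit 0

  # Over the limit: drop artifacts untouched for a day, then
  # remove the now-empty per-binary directories.
  find "$CACHE" -type f -mtime +1 -delete
  find "$CACHE" -type d -mindepth 1 -empty -delete

Run from cron or launchd every N hours this would work, but it
still has exactly the problem above: the right N and limit
depend on a usage rate that is not constant.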

Any ideas?

As a side note, online documentation suggests this cache can
only be deleted when SIP is disabled (as it currently is on this
machine). Would users of Macs with SIP enabled experience the
same issue and be unable to fix it without temporarily disabling
SIP? That would seem like a major oversight to me. Automated
testing is not an "edge case".


ZV

[0]: 
https://support.apple.com/guide/security/rosetta-2-on-a-mac-with-apple-silicon-secebb113be1/web

[1]: 
https://apple.stackexchange.com/questions/427695/how-can-i-limit-the-disk-usage-of-the-rosetta-2-oah-cache


