I have an application that creates several very small temporary files as part of unit tests or specific functionality.
I have started experiencing "random" OSError: [Errno 27] File too large errors more and more frequently. By random I mean the problem sometimes goes away after a reboot, or when I re-run the tests some time later. I manually checked that the temp folder on macOS is cleaned up and has enough free space (several GB available) to create such small files. Small files in this context are, for example, 16384, 58330 or 26502 bytes, or even less. shutil.copyfile is used to create these files, but the same error also occurs with plain os.link, which should take up minimal space on disk. I replaced shutil.copyfile (where possible) with os.link to test whether it fixes the problem, but the effect is the same.
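For completeness, this is roughly how I verified the free space (and, while at it, free inodes) on the filesystem holding the temp directory; the helper name and printout are mine, not part of the test suite:

```python
import os
import tempfile

def fs_stats(path=tempfile.gettempdir()):
    """Free space and free inodes on the filesystem holding `path`."""
    st = os.statvfs(path)
    return {
        "free_bytes": st.f_bavail * st.f_frsize,  # space available to unprivileged users
        "free_inodes": st.f_favail,               # inodes available to unprivileged users
    }

print(fs_stats())  # both values are comfortably large for 16-60 KB files
```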
On macOS, this error is typically thrown after some random amount of time spent intensively running a lot of tests during development. Inside the Docker image, however, the error occurs every single time.
Error snippets:
    @pytest.fixture(scope="module")
    def simulate_mirror(fixtures):
        """
        This fixture creates a file/directory structure that simulates an offline PyPI mirror
        Used to test the `mirror://` bandersnatch integration for scanning the whole PyPI repository
        """
        from aura import mirror as amirror

        with tempfile.TemporaryDirectory(prefix="aura_test_mirror_") as mirror:
            pmirror = Path(mirror)
            assert pmirror.is_dir()
            os.mkdir(pmirror / "json")

            for pkg, pkg_files in MIRROR_FILES.items():
                # copy the package JSON metadata
                os.link(
                    fixtures.path(f"mirror/{pkg}.json"),
                    pmirror / "json" / pkg
                )

                for p in pkg_files:
                    os.makedirs(pmirror / p["path"])
                    os.link(
                        fixtures.path(f"mirror/{p['name']}"),
    >                       os.fspath(pmirror / p["path"] / p["name"])
                    )
    E                   OSError: [Errno 27] File too large: '/analyzer/tests/files/mirror/wheel-0.34.2-py2.py3-none-any.whl' -> '/tmp/aura_test_mirror_a6o5p8fn/packages/8c/23/848298cccf8e40f5bbb59009b32848a4c38f4e7f3364297ab3c3e2e2cd14/wheel-0.34.2-py2.py3-none-any.whl'

    tests/conftest.py:204: OSError
    def test_apip():
        # Test package taken from pip tests
        # https://github.com/pypa/pip/tree/master/tests/data/packages
        whl = Path(__file__).parent / 'files' / 'simplewheel-1.0-py2.py3-none-any.whl'
        venv_dir = tempfile.mkdtemp(suffix="_pytest_aura_apip")
        # print(f'Virtualenv created in {venv_dir}')

        try:
            # Create virtualenv
            venv.create(
                env_dir=venv_dir,
    >           with_pip=True,
                # symlinks=True
            )
tests/test_apip.py:56:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    /usr/local/lib/python3.7/venv/__init__.py:390: in create
        builder.create(env_dir)
    /usr/local/lib/python3.7/venv/__init__.py:66: in create
        self.setup_python(context)
    /usr/local/lib/python3.7/venv/__init__.py:233: in setup_python
        copier(context.executable, path)
    /usr/local/lib/python3.7/venv/__init__.py:176: in symlink_or_copy
        shutil.copyfile(src, dst)
    /usr/local/lib/python3.7/shutil.py:122: in copyfile
        copyfileobj(fsrc, fdst)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
fsrc = <_io.BufferedReader name='/usr/local/bin/python'>, fdst = <_io.BufferedWriter name='/tmp/tmpw5gbckwv_pytest_aura_apip/bin/python'>, length = 16384
    def copyfileobj(fsrc, fdst, length=16*1024):
        """copy data from file-like object fsrc to file-like object fdst"""
        while 1:
            buf = fsrc.read(length)
            if not buf:
                break
    >       fdst.write(buf)
    E       OSError: [Errno 27] File too large

    /usr/local/lib/python3.7/shutil.py:82: OSError
These errors are also sometimes thrown when creating a virtualenv via venv.create. Inside the Docker image I additionally always get sqlite3.OperationalError: disk I/O error, which might be related to the same underlying problem.
More technical information:
macOS Catalina, fully upgraded; Python reinstalled via brew to the latest 3.7.7, all virtualenvs recreated and all dependencies reinstalled. Based on other SO questions (File too Large python) I have already checked that the file sizes are well within the filesystem's limits, and likewise the maximum number of files allowed in a directory.
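One more check that may or may not be relevant (an assumption on my side, not something I have confirmed): EFBIG (errno 27) is also what a write receives when it crosses the per-process file-size resource limit (ulimit -f), so the current limits can be inspected like this:

```python
import resource

# RLIMIT_FSIZE caps the maximum size of a file this process may create;
# writes past the soft limit fail with OSError: [Errno 27] File too large.
soft, hard = resource.getrlimit(resource.RLIMIT_FSIZE)
print("RLIMIT_FSIZE soft:", soft, "hard:", hard)
# resource.RLIM_INFINITY means unlimited
```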
Latest commit containing the issue (includes dockerfile that fails with the error):
https://github.com/RootLUG/aura/commit/b4c730693e8f7fd36ab2acc78997694002c4e345
Code locations triggering the error:
https://github.com/RootLUG/aura/blob/dev/tests/conftest.py#L181
https://github.com/RootLUG/aura/blob/dev/tests/test_apip.py#L54
Travis log from the unit tests: