No space error on 3.21.2

Problem:
Sync of a Red Hat repo failed with "[Errno 28] No space left on device".
Expected outcome:
The sync job completes successfully.
Pulpcore version:
3.21.2
Pulp plugins installed and their versions:
pulp-rpm 3.18.7
Operating system - distribution and version:
RHEL 8.6
Other relevant data:

```
"description": "[Errno 28] No space left on device"
"traceback":
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/tasking/pulpcore_worker.py", line 452, in _perform_task
    result = func(*args, **kwargs)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulp_rpm/app/tasks/synchronizing.py", line 559, in synchronize
    repo_version = dv.create() or repo.latest_version()
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/declarative_version.py", line 161, in create
    loop.run_until_complete(pipeline)
  File "/usr/lib64/python3.9/asyncio/base_events.py", line 642, in run_until_complete
    return future.result()
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
    await asyncio.gather(*futures)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/api.py", line 43, in call
    await self.run()
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 183, in run
    pb.done += task.result()  # download_count
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/artifact_stages.py", line 209, in _handle_content_unit
    await asyncio.gather(*downloaders_for_content)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/models.py", line 119, in download
    raise e
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/plugin/stages/models.py", line 111, in download
    download_result = await downloader.run(extra_data=self.extra_data)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/download/http.py", line 273, in run
    return await download_wrapper()
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/backoff/_async.py", line 151, in retry
    ret = await target(*args, **kwargs)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/download/http.py", line 258, in download_wrapper
    return await self._run(extra_data=extra_data)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulp_rpm/app/downloaders.py", line 118, in _run
    to_return = await self._handle_response(response)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/download/http.py", line 211, in _handle_response
    await self.handle_data(chunk)
  File "/opt/utils/venv/pulp/3.9.7/lib64/python3.9/site-packages/pulpcore/download/base.py", line 137, in handle_data
    self._writer.write(data)
  File "/usr/lib64/python3.9/tempfile.py", line 474, in func_wrapper
    return func(*args, **kwargs)
```

We are using S3 storage, which has plenty of space left. The error happens when we run multiple syncs at the same time; when we subsequently reran the sync for just the repo that had failed, the job succeeded. This doesn't look like a storage-space issue on the backend. Please advise what could cause this error.

I believe the issue is that Pulp is running out of space on the local storage used for the sync task's working directory. Tasks create temporary directories where they store created/downloaded temporary files before moving them into the actual storage backend. All of these temporary directories live under the directory given by the WORKING_DIRECTORY setting, which defaults to /var/lib/pulp/tmp (DEPLOY_ROOT + "tmp"). Running multiple syncs at once can fill this directory quickly, especially for large syncs with large files.
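As a quick sanity check, you can see how much free space backs that directory before launching parallel syncs. A minimal sketch (the PULP_WORKING_DIRECTORY override and the fallback to the system temp dir are illustrative assumptions so it runs anywhere, not Pulp's own lookup logic):

```python
# Sketch: report free space on the filesystem backing Pulp's working
# directory. The default WORKING_DIRECTORY is /var/lib/pulp/tmp; the
# env-var override and temp-dir fallback here are illustrative only.
import os
import shutil
import tempfile

def free_gib(path):
    """Free space in GiB on the filesystem containing `path`."""
    return shutil.disk_usage(path).free / 1024**3

workdir = os.environ.get("PULP_WORKING_DIRECTORY", "/var/lib/pulp/tmp")
if not os.path.isdir(workdir):
    # Fall back so the sketch also runs outside a Pulp host.
    workdir = tempfile.gettempdir()

print(f"{workdir}: {free_gib(workdir):.1f} GiB free for sync temp files")
```

If that number is small relative to the size of the repos being synced, concurrent syncs will exhaust it well before the S3 backend is ever a factor.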

One option is to increase the local space available for WORKING_DIRECTORY. You can also decrease the number of concurrent downloads allowed on the remote via download_concurrency, so that multiple syncs won't spawn as many downloaders at the same time. Finally, spacing out the syncs can also relieve the storage pressure.
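For the download_concurrency knob, the remote can be updated with an HTTP PATCH against its href. A hedged sketch with the standard library (the server URL and remote href are placeholders; the request is built but not sent, and you would need to add authentication to actually apply it):

```python
# Sketch: build the PATCH that lowers download_concurrency on an RPM
# remote. BASE and HREF are placeholders for your server and remote.
import json
import urllib.request

BASE = "https://pulp.example.com"           # placeholder server URL
HREF = "/pulp/api/v3/remotes/rpm/rpm/.../"  # placeholder remote href

def build_patch(href, concurrency):
    """Return a PATCH request setting download_concurrency on a remote."""
    body = json.dumps({"download_concurrency": concurrency}).encode()
    return urllib.request.Request(
        BASE + href,
        data=body,
        method="PATCH",
        headers={"Content-Type": "application/json"},
    )

req = build_patch(HREF, 2)
# urllib.request.urlopen(req)  # uncomment and add auth to actually apply
print(req.get_method(), req.full_url)
```

The same change can be made through whatever client you already use against the REST API; the key is the download_concurrency field on the remote.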


This seems to be the issue tracked at https://github.com/pulp/pulpcore/issues/1936