lost log files under high load (using .gz) #3648

dasvex opened this issue May 5, 2025 · 0 comments

Description

In our production environment, entries in log files are lost under high load.
This may be caused by the archiving process.

It appears that if the archiving of one file has not yet completed when a second file starts being archived, the two processes can overwrite each other.
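For context, the scenario described typically involves a RollingFileAppender whose filePattern ends in .gz, so each rollover spawns a compression step. A minimal sketch of such a configuration follows; all file names, sizes, and limits here are illustrative assumptions, not taken from the report:

```xml
<!-- Hypothetical log4j2.xml sketch: size-based rollover with gzip
     compression. A small size limit forces frequent rollovers under
     load, so a second rollover can begin before the previous .gz
     archiving step has finished. All names and values are illustrative. -->
<Configuration status="warn">
  <Appenders>
    <RollingFile name="App" fileName="logs/app.log"
                 filePattern="logs/app-%d{yyyy-MM-dd}-%i.log.gz">
      <PatternLayout pattern="%d %p %c - %m%n"/>
      <Policies>
        <!-- Tiny size limit to trigger back-to-back rollovers -->
        <SizeBasedTriggeringPolicy size="1 MB"/>
      </Policies>
      <DefaultRolloverStrategy max="50"/>
    </RollingFile>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="App"/>
    </Root>
  </Loggers>
</Configuration>
```

Under sustained logging throughput, a configuration of this shape rolls over frequently enough that compression of consecutive archives can overlap, which matches the race the report suspects.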

Configuration

Version: 2.24.3

Operating system: Oracle Linux Server 8.4; Windows 10

JDK: 21

Reproduction

The problem most likely reproduces well on weak servers, where file archiving takes considerable time: for example, servers with a slow HDD or a slow CPU.

A simple test to reproduce the issue is available here:
https://github.com/dasvex/log4j-lost-files-on-gz/tree/main
