[PATCH] Handle memory issues with large files
From: Samanta Navarro
Subject: [PATCH] Handle memory issues with large files
Date: Sun, 9 Jan 2022 12:00:35 +0000
If an input file contains a line that is too large to allocate on the
heap, it can happen that no error is shown even though the file is not
processed any further. Such a condition should lead to an error return
value instead of being silently treated as success.
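
For illustration, here is a minimal sketch of the pattern involved.
This is not the acl sources; getline() merely stands in for
__acl_next_line(). The point is that a NULL return conflates end of
file with allocation failure:

#define _POSIX_C_SOURCE 200809L
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for __acl_next_line(); not the acl sources.  getline()
 * returns -1 both at end of file and when the line is too large to
 * allocate, so a NULL return here conflates the two cases. */
static char *next_line(FILE *file)
{
	char *line = NULL;
	size_t len = 0;

	if (getline(&line, &len, file) < 0) {
		free(line);
		return NULL;	/* EOF or out of memory -- caller cannot tell */
	}
	return line;
}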
Proof of Concept:
$ dd if=/dev/zero bs=1024 count=2048 | tr '\0' '#' > acl.txt
$ echo "invalid line" >> acl.txt
$ ulimit -d 1024
$ setfacl --restore=acl.txt
$ echo $?
0
The return value of setfacl should not be zero (success).
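
The fix below applies the disambiguation a caller of the sketch above
would need: after the read loop, only a stream with its end-of-file
indicator set counts as success. A hypothetical harness, which compiles
as a single file together with the next_line() sketch above:

#include <stdio.h>
#include <stdlib.h>

/* Hypothetical harness, reusing next_line() from the sketch above. */
int main(int argc, char *argv[])
{
	FILE *file;
	char *line;

	if (argc != 2 || (file = fopen(argv[1], "r")) == NULL)
		return EXIT_FAILURE;

	while ((line = next_line(file)) != NULL)
		free(line);

	/* NULL with feof() set is a clean end of file; NULL without it
	 * means a line could not be read into memory.  Checking both
	 * indicators turns the silent truncation into a nonzero exit. */
	if (ferror(file) || !feof(file)) {
		fclose(file);
		return EXIT_FAILURE;
	}
	fclose(file);
	return EXIT_SUCCESS;
}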
---
The error message complains about "line 0". Maybe this should be
adjusted as well.
---
tools/parse.c | 9 +++++++--
1 file changed, 7 insertions(+), 2 deletions(-)
diff --git a/tools/parse.c b/tools/parse.c
index f052400..0803812 100644
--- a/tools/parse.c
+++ b/tools/parse.c
@@ -447,8 +447,11 @@ read_acl_comments(
 		(*lineno)++;
 
 		line = __acl_next_line(file);
-		if (line == NULL)
-			break;
+		if (line == NULL) {
+			if (feof(file))
+				break;
+			goto fail;
+		}
 
 		comments_read = 1;
 
@@ -580,6 +583,8 @@ read_acl_seq(
 
 	if (ferror(file))
 		goto fail;
+	if (!feof(file) && line == NULL)
+		return -1;
 
 	return 0;
 fail:
--
2.34.1