The stdio fgetc returns the character read as an unsigned char cast to
an int.
It is cast from unsigned char to int because EOF is defined as a
negative value (usually -1).
At the moment atomicio.test stores the returned int in a char.
However, the C standard states that the signedness of char is
implementation-defined. This makes the test non-portable: on an
architecture/ABI where char is unsigned, the test won't compile, since
an unsigned value will always be != -1 (EOF).
This is the error message you would get on an aarch64 host with gcc 5.4.0:
build/ARM/base/atomicio.test.cc:121:48:
error: comparison is always true due to limited range of data type
[-Werror=type-limits]
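
For reference, a minimal sketch of the portable fgetc idiom (not gem5
code, just an illustration with a hypothetical input file): keep the
return value in an int, compare against EOF, and only narrow to
unsigned char after that check.

  #include <cstdio>

  int
  main()
  {
      // Hypothetical input file used only for illustration.
      FILE *file = std::fopen("example.txt", "r");
      if (!file)
          return 1;

      int c;  // int, not char: char may be unsigned on some ABIs
      while ((c = std::fgetc(file)) != EOF) {
          // Narrowing back to unsigned char is safe once EOF is ruled out.
          unsigned char byte = (unsigned char)c;
          std::putchar(byte);
      }

      std::fclose(file);
      return 0;
  }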
Change-Id: I120e44b5204d98e643f19b8dd6fa2762342a6e64
Signed-off-by: Giacomo Travaglini <giacomo.travaglini@arm.com>
Reviewed-on: https://gem5-review.googlesource.com/c/public/gem5/+/25384
Reviewed-by: Ciro Santilli <ciro.santilli@arm.com>
Reviewed-by: Jason Lowe-Power <jason@lowepower.com>
Tested-by: kokoro <noreply+kokoro@google.com>
EXPECT_EQ(file_contents.size(), size);
- char c;
+ int c;
for (unsigned int i = 0; (c = fgetc(file)) != EOF; i++) {
- EXPECT_EQ(file_contents[i], c);
+ EXPECT_EQ(file_contents[i], (unsigned char)c);
}
fclose(file);