Hi Thomas,
On Tue, Mar 04, 2025 at 08:10:45AM +0100, Thomas Weißschuh wrote:
> The printf format checking in the compiler uses the compiler's own intmax
> types, not the ones from libc. This can lead to compiler errors.
>
> Instead use the types already provided by the compiler.
> Example issue with clang 19 for arm64:
>
>   nolibc-test.c:30:2: error: format specifies type 'uintmax_t' (aka 'unsigned long') but the argument has type 'uintmax_t' (aka 'unsigned long long') [-Werror,-Wformat]
>
> Signed-off-by: Thomas Weißschuh <thomas.weissschuh@linutronix.de>
> ---
>  tools/include/nolibc/stdint.h | 4 ++--
>  1 file changed, 2 insertions(+), 2 deletions(-)
>
> diff --git a/tools/include/nolibc/stdint.h b/tools/include/nolibc/stdint.h
> index cd79ddd6170e05b19945e66151bcbcf840028d32..b052ad6303c38f09685b645268dad1fa8848370d 100644
> --- a/tools/include/nolibc/stdint.h
> +++ b/tools/include/nolibc/stdint.h
> @@ -39,8 +39,8 @@ typedef size_t uint_fast32_t;
>  typedef int64_t int_fast64_t;
>  typedef uint64_t uint_fast64_t;
> -typedef int64_t intmax_t;
> -typedef uint64_t uintmax_t;
> +typedef __INTMAX_TYPE__ intmax_t;
> +typedef __UINTMAX_TYPE__ uintmax_t;
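
For anyone reading along, here is a minimal standalone sketch of what trips the
warning. It is not taken from the patch: the typedef names are made up, and it
assumes an LP64 target such as arm64 where the compiler's __UINTMAX_TYPE__ is
"unsigned long" while nolibc's uint64_t is "unsigned long long":

  /* Illustration only: hypothetical typedef names, not the nolibc ones. */
  #include <stdio.h>

  typedef unsigned long long old_nolibc_uintmax_t; /* old nolibc: uint64_t        */
  typedef __UINTMAX_TYPE__   compiler_uintmax_t;   /* what %ju is checked against */

  int main(void)
  {
          old_nolibc_uintmax_t o = 0xffffffffffffffffULL;
          compiler_uintmax_t   c = 0xffffffffffffffffULL;

          printf("%ju\n", o); /* -Wformat: unsigned long long vs the compiler's unsigned long */
          printf("%ju\n", c); /* OK: matches __UINTMAX_TYPE__ */
          return 0;
  }
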
Just thinking out loud. While I understand the rationale behind this change, it somewhat contradicts the printf test case, where we explicitly pass it as an "unsigned long long" that's expected to be 64 bits:

  CASE_TEST(uintmax_t);  EXPECT_VFPRINTF(20, "18446744073709551615", "%ju", 0xffffffffffffffffULL); break;
Do we really have guarantees that a compiler will always declare it as a 64-bit unsigned long long? E.g. we could see new compilers decide that uintmax_t becomes 128-bit. Well, maybe in that case it will simply be a matter of updating the test case after all...
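
If that ever happens, one way to keep the argument in sync with whatever type the compiler picks (just a sketch, not a proposal for this series) would be to cast the constant in the test, though the expected length and string would still assume a 64-bit uintmax_t:

  CASE_TEST(uintmax_t);  EXPECT_VFPRINTF(20, "18446744073709551615", "%ju", (uintmax_t)0xffffffffffffffffULL); break;

That at least keeps %ju and its argument matched even if the underlying type changes.
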
Willy