local ffi = require("ffi")
ffi.cdef[[
int printf(const char *__restrict, ...);
int scanf(const char *__restrict, ...);
int sscanf(const char *__restrict, const char *__restrict, ...);
]]
--local a = ffi.new("int[1]"); -- This one prints '0' as the value
local a = ffi.new("int64_t[1]"); -- This one works for a 64-bit build
ffi.C.printf("Enter an integer\n");
ffi.C.scanf("%d", a);
ffi.C.printf("Integer that you have entered is %d\n", a[0]);
Given the above cdef, when trying to do a scanf, how 'a' is defined determines
whether it works or not. When it is defined as an int64_t, the call succeeds
and prints out the integer value that I typed. When it's defined simply as
'int', as in the commented-out case, it prints '0' as the value.
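
A minimal variant to separate the scanf step from the output step would be the
following (untested sketch; 'b' is just a fresh name), reading the value back
with Lua's own print instead of ffi.C.printf:

local b = ffi.new("int[1]")
ffi.C.scanf("%d", b) -- same call as above, int-sized slot
print(b[0])          -- print on the Lua side instead of via C printf
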
On the other hand,
local pn = ffi.new("int[1]")
ffi.C.sscanf("foo 123", "foo %d", pn)
print(pn[0])
In this case, where I'm doing sscanf on a string, I can simply use 'int' and it
works as expected, printing '123'.
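
Conversely, here is a sketch that routes the sscanf result through
ffi.C.printf instead of Lua's print, to check whether the output path is what
differs between the two cases ('pn2' is just a fresh name):

local pn2 = ffi.new("int[1]")
ffi.C.sscanf("foo 123", "foo %d", pn2)
ffi.C.printf("%d\n", pn2[0]) -- same value, printed via C printf
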
Is this a bug somewhere (in the scanf case), or did I do something wrong?
Linux Ubuntu 14.04, 64-bit LuaJIT