LuaJIT struct vs. table performance

  • From: Jordan Henderson <jordan.henderson@xxxxxxxxxxxxxxxx>
  • To: <luajit@xxxxxxxxxxxxx>
  • Date: Thu, 3 Apr 2014 23:13:05 +1000

Hi,
I am wondering if the overhead of using a lua table (hash lookups etc) are greater than the overhead of converting lua types -> c types. I know I could benchmark this myself, but I thought it might be easier to post here.
For instance, if I was to store 5 strings as such:
local t = {}
t.a = "asdf"
t.b = "asdf"
t.c = "asdf"
t.d = "asdf"
t.e = 1234

versus a fixed ffi struct:
ffi.cdef[[
typedef struct {
  const char* a;
  const char* b;
  const char* c;
  const char* d;
  int e;
} t;
]]
... (create the struct/set members etc).

Which one, in theory, is the better option, assuming no other members are needed? Would there be any significant speed improvement when creating and using a large number of these struct objects, versus tables plus their lookup overhead (even counting the Lua-to-C type conversion overhead)? In the end, I am hoping to identify the option with the best performance. I could also preallocate an array of these structs beforehand to avoid lots of small allocations.
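For what it's worth, the preallocation I have in mind would look something like this (a rough sketch; `rec_t`, `N`, and the field values are just placeholders):

```lua
local ffi = require("ffi")

ffi.cdef[[
typedef struct {
  const char* a;
  const char* b;
  const char* c;
  const char* d;
  int e;
} rec_t;
]]

-- Preallocate N structs in one contiguous VLA, rather than
-- allocating N separate cdata objects.
local N = 1000
local recs = ffi.new("rec_t[?]", N)

local s = "asdf"
for i = 0, N - 1 do
  recs[i].a = s    -- stores a pointer to the string's bytes
  recs[i].e = 1234
end
```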

One other thing - I am wondering what the lifetime of struct members is in relation to, say, Lua strings. I assume the raw char* stored in an FFI struct member points directly at memory managed by Lua's GC, and is therefore only valid as long as the original string is still referenced somewhere on the Lua side?
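To illustrate the situation I'm worried about (a hypothetical sketch; `holder_t` is made up):

```lua
local ffi = require("ffi")

ffi.cdef[[
typedef struct { const char* a; } holder_t;
]]

local h = ffi.new("holder_t")
do
  local s = "temporary"
  h.a = s  -- h.a now points at s's bytes; I assume the struct does
           -- NOT keep s alive as far as the GC is concerned
end
-- If s gets collected here, is h.a a dangling pointer? Presumably
-- I would need to keep a Lua-side reference to s (e.g. in a table)
-- for as long as h is in use.
```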

Thanks ahead of time - these questions have been nagging me for a while now.
