Do most games that use Lua for scripting (addons, UI, possibly game logic) actually use LuaJIT?
As far as I could tell, 100% of the AAA titles I tested that use it (3 in total) did not. And in profiling, Lua was actually consuming a relatively large share of CPU.
That confuses me given how easy it is to use LuaJIT over the reference, and how vital performance in these environments is.
Games, contrary to the beliefs of dense hiring managers, disconnected producers, and even designers, are performance-critical applications -- achieving a mere 30 FPS and > XX+ms input latency on hardware 5 years in the future in the enthusiast category, on the order of $4000+, is an absolute failure. Every gamer I've ever met (thousands upon thousands, hundreds that I know well) is extremely sensitive to performance. Performance is king. If a game looks a little shitty but runs AMAZING, it is better than a game that looks AMAZING but runs a little shitty. Hell, it's even better than a game that looks WTFAMAZING and runs pretty well. We only start caring about the looks, typically, once performance is sufficient. If we can dial the game's graphics up while maintaining performance, we get happy. Enthusiasts love it when they can crank everything to max settings and still play at 60+ fps with unnoticeable input lag. So running your render loop and Lua on the same thread (a WTF in itself) and not using LuaJIT is a huge WTF.
I think people are gradually switching over. There may be some porting involved, and people may have lots of Lua code for older versions. Plus, depending on what you are doing, the JIT may not speed things up (and you can't use it on many consoles); 2.1 accelerates a lot of string functions that would break jitting in 2.0. But someone has just paid for a PS4 port, so clearly many people are using it...
Some games use it. I'm under the impression the number is much larger than 3. There are competing interpreters like Havok.
However, vanilla Lua is pretty fast to begin with. Additionally, the optimizations you do for vanilla Lua are the opposite of the ones you do for LuaJIT, and vice versa: with vanilla Lua you push hot code down into C, while with LuaJIT you keep it in Lua so the JIT can trace it. Generally you need to decide early in the project which one you are targeting, because it impacts how you write code. And it isn't always guaranteed that LuaJIT will be faster. Since game engine cores are written in C/C++ and scripts spend much of their time calling into them, that style of optimization works against LuaJIT.
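As a made-up illustration of that last point (the engine module and engine.compute below are invented names, and the pure-Lua fallback is only there so the sketch runs):

    -- "engine.compute" stands in for a hot function exported through the
    -- classic Lua/C API (an invented binding; substitute your engine's own).
    local ok, engine = pcall(require, "engine")
    if not ok then
      engine = { compute = function(i) return i * 0.5 end }  -- pure-Lua stand-in
    end

    local sum = 0
    for i = 1, 1e7 do
      -- With a real classic C-API binding here, LuaJIT aborts the trace and the
      -- loop runs interpreted; keeping the work in plain Lua lets the JIT compile it.
      sum = sum + engine.compute(i)
    end
    print(sum)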
Ultimately, games need to stay within their fps budget. If Lua is high on the CPU, but the game is still in budget, nobody really cares. And CPU profiling doesn't always tell the whole story since many things can be GPU bound. It could very well be that the game is already GPU bound so getting more CPU back won't help.
There are other Lua implementations (for example HavokVM, which is Lua with a very good debugging toolset), and possibly many others too (commercial or not, or under other names).
Curious how many people here actually use Lua? What are the applications you are using it for? I know it is used a bit in the gaming industry for scripting purposes, but I am not aware of anything else.
We use it internally to map some of the functions in our library to OpenGL.
I've used Lua in the realm of quantitative finance for years. Early on, it was primarily a flexible configuration tool, embedded in a complex event processing system.
With the advent of LuaJIT 2, I've used it standalone for analyzing many, many millions of messages per day in real time, then feeding the resultant datasets to other, more batteries-included environments which aren't nearly as fast (e.g. Python, Matlab) or to datastores like Redis and MySQL.
I also use OpenResty for web applications (risk management apps, charting, and other visualizations), often pulling from those same datastores. OpenResty is a joy to work with and is crazy fast.
I make heavy use of the FFI, both for library bindings (including system calls) and for general C structure use. The messages mentioned above are represented as C structures and are efficiently processed by LuaJIT. I often find myself writing networking programs in Lua rather than C. Check out `ljsyscall`.
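To give a feel for the "messages as C structures" pattern, here's a rough sketch -- the trade_msg layout and field names are invented, not any real feed format:

    local ffi = require("ffi")

    -- Invented message layout, purely to show the pattern.
    ffi.cdef[[
    typedef struct {
      uint64_t timestamp;
      uint32_t symbol_id;
      double   price;
      uint32_t size;
    } trade_msg;
    ]]

    -- Cast a raw buffer (e.g. read off a socket) straight to the struct: no
    -- parsing into Lua tables, and LuaJIT compiles the field accesses.
    local function notional(buf)
      local msg = ffi.cast("const trade_msg *", buf)
      return tonumber(msg.price) * msg.size
    end

    local m = ffi.new("trade_msg", { timestamp = 0, symbol_id = 1, price = 42.5, size = 100 })
    print(notional(m))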
Dealing with memory has definitely been a stumbling point, but once I built enough scaffolding to deal with it (including using jemalloc), it has been clear sailing. Now I store millions of objects taking many GB of memory without significant GC pressure using FFI-based HashMaps and Vectors.
Also, understanding how to keep the code in the JIT (versus the interpreter) has taken some artistry -- e.g. examining the output of -jv and -jdump.
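For anyone who hasn't tried them: those diagnostics can be switched on from the command line (luajit -jv myscript.lua) or from inside the program. A minimal sketch (the -jdump flag string is only an example):

    -- Print one line per compiled/aborted trace (same as running with -jv).
    require("jit.v").on()

    -- Or dump bytecode/IR/mcode per trace to a file (same as -jdump):
    -- require("jit.dump").on("tbim", "jit.dump.txt")

    local sum = 0
    for i = 1, 1e6 do sum = sum + i end
    print(sum)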
Although this thread is about LuaJIT 2.0.3 release, I use the LuaJIT 2.1 branch which has many enhancements; I especially appreciate the string improvements and trace stitching.
Cool. I found the hash maps and jemalloc binding on Github via your profile. That's something I imagine we will need in Snabb Switch eventually. Now we know where to find it! Thanks.
jemalloc really improved my situation. Before it, I would run a study and then randomly get an out-of-memory error despite having a relatively small number/size of GC objects... my FFI objects were snagging address space from them. Switching allocations to jemalloc (which LDS makes easy to do) alleviated the problem entirely.
I've been eyeing SnabbSwitch for a while -- I use OpenOnload extensively, so I appreciate what you are doing.
I haven't found a need for Lapis, although I think it is an interesting project and enjoy all his work. I use leafo's template engine: https://github.com/leafo/etlua
Definitely for normalization, e.g. take a UTDF message, convert it to some representation (either a C-blob, CSV, or JSON, depending on final use), then push it to a Redis list named utdf.MSFT.
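Roughly like this, as a toy sketch -- the field names and the `redis` client module (e.g. redis-lua, or any binding with an RPUSH call) are assumptions, not my actual code:

    local cjson = require("cjson")    -- lua-cjson
    local redis = require("redis")    -- assumed Redis client
    local client = redis.connect("127.0.0.1", 6379)

    -- Take a decoded UTDF-like message (fields invented here) and push a
    -- normalized JSON form onto a per-symbol list.
    local function publish(msg)
      local doc = cjson.encode({
        ts = msg.timestamp, px = msg.price, qty = msg.size, venue = msg.venue,
      })
      client:rpush("utdf." .. msg.symbol, doc)
    end

    publish({ timestamp = 1404000000, price = 41.23, size = 200, venue = "Q", symbol = "MSFT" })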
But beyond normalization, there is building books (be they level-1, 2, or 3), maintaining state, and extracting information from that. Random example (i.e. not a real query I've done): emit all the trade executions on any venue that happen within the first two depth levels of NASDAQ.
LuaJIT could do the hard crunching, like regression analysis, quite well, and there are some interesting projects like GSL-shell... but Matlab and numpy/matplotlib are quite expressive and more complete environments.
In case you see this, could you tell me if MoonScript exposes Lua coroutines through its syntax? The docs don't mention it, and I have not been able to get any information from the MoonScript author.
I use it, my favorite language by far (after trying about all of them).
>What are the applications you are using it for ?
Game development.
>I know it is used a bit in the gaming industry for scripting purposes
Not just scripting anymore. There are multiple production grade game engines where you write all the code in Lua(JIT). This variant is big in the mobile/casual sector of the industry. Corona [1] seems to be the current market leader there. There is also an open source game engine - LÖVE [2] - where you use Lua(JIT) for everything. However it is more of a hobby/enthusiast thing. It is much simpler, and very few commercial games are based on it. In contrast, Corona is widely used for commercial projects.
I started using lua(jit) with nginx over the past year, with more work recently. The work by agentzh[1] with the lua nginx module and the openresty stuff was very useful. Being able to tweak the behavior of nginx with a full-featured programming language is very useful.
I just finished using it to build realtime stat tracking and a websockets server to display that data to web clients[2][3]. Additionally I've done some work with custom processing in a reverse proxy and handling case-insensitive URLs.
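The case-insensitive URL part is a nice example of how small these tweaks can be. A rough sketch (not my production code) of a handler you would point rewrite_by_lua_file at:

    -- rewrite.lua: lowercase the request path before nginx routes it,
    -- so /Foo/Bar and /foo/bar hit the same location/upstream.
    local uri = ngx.var.uri
    local lowered = string.lower(uri)
    if lowered ~= uri then
      ngx.req.set_uri(lowered, false)  -- rewrite in place, no internal redirect
    end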
Overall I've found it very comfortable to use and fast, but a little more barebones than something like python. In my use cases, the speed and low overhead has been more than worth it.
I wrote a tool at work to do packet analysis in Wireshark which has a Lua API. I've written a couple of simple games using the Love2D framework. I once wrote an IRC bot using luasocket. I started to write a debugger which was an amalgam of C and Lua, but I've since abandoned that project (not because of Lua, but because I lacked the free time and the patience to support more than one object file format).
I also experimented with using Lua as a target for a Clojure-like Lisp dialect. The compiler is itself written in Lisp and compiles itself to Lua. I've written an Asteroids clone and some toys in Lisp and compiled them into Lua.
It's a very fun language to program in. The reference implementation is also small enough that you can reasonably expect to know its ins and outs in about a week. The C API is really nice; it makes embedding in C a breeze. LuaJIT itself has the best FFI I've ever seen: you can practically #include a C header file in your Lua and it just works.
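For instance, calling straight into libc is only a few lines (a minimal sketch):

    local ffi = require("ffi")

    -- Paste the declaration (almost) straight out of the C header...
    ffi.cdef[[
    int printf(const char *fmt, ...);
    ]]

    -- ...and call it. No glue code, no manual stack juggling.
    ffi.C.printf("Hello from %s\n", "LuaJIT")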
Please do not confuse Lua with LuaJIT. Whilst LuaJIT claims to be ABI compatible with Lua 5.1, 5.1 is not the latest release of Lua. There are some in the Lua community who consider these to be two different languages, or at least consider LuaJIT a fork.
I'm not following. LuaJIT is 5.1 compatible, with some of the 5.2 features in (enabled at compile time with LUAJIT_ENABLE_COMPAT52).
Just because it's not the latest release of Lua does not make it any less the Lua language. Yes, bytecode generated by the Lua reference implementation can't be loaded by LuaJIT (or maybe things have changed since), but that's just like saying Microsoft C++ libs can't be used by MinGW.
You may want to inform Mike Pall that it is 5.1 compatible then. Every time someone on the list says something to the effect that they have just replaced the dll/lib, he informs them that they are doing it wrong and that instead they should do it this way or that way with LuaJIT. The FFI has become such a core feature of LuaJIT now that as soon as you start using it you have already discounted ever using Lua again. So let us be honest: LuaJIT has split from Lua and is now a separate language, which Mike says he will not update to support the latest version, and he keeps adding incompatible features.
I built a web framework & control panel (early stages) using Lua + Nginx (http://github.com/Fizzadar/Luawa & http://github.com/Oxygem/Oxypanel). Although both are pretty basic currently I think Lua has huge potential when it comes to high performance web frameworks.
I occasionally did some game development with it, now I'm writing a toy IRC Bot with hot-pluggable plugins, sandboxing and other fun features: https://code.hackerspace.pl/q3k/mun/ (you can talk to it on #hackerspace-pl on freenode, it's called moodspeak).
One interesting use of Lua is in LaTeX. Recent distributions of LaTeX come with binaries luatex and lualatex. I'm not familiar with the exact details, but if you'll Google around you'll find many references to LaTeX internals being rewritten in Lua.
I've found LuaLaTeX to be quite stable already and used it to compile my PhD thesis.
I think its niche is performance-sensitive and/or memory-constrained applications where seamless use of C plus a scripting language is important. Games are an obvious one, but I'm sure there are others.
I'm learning Lua. No idea what I would use it for, especially given JavaScript's striking similarities. Probably the most interesting app I've found is Luvit, a descendant of Node. Libuv and http-parser are C, so it's not a crazy idea. I like it.
If you know a better way of using async i/o in Lua, I'm all ears.
Well, if manually managed memory allocated through the FFI is unacceptable to you for some reason, you are out of luck indeed. The limit is per process, but I guess distributing the task across multiple processes is not acceptable either.
By the way, it is worth pointing out that Mike Pall (LuaJIT's author) has said that the current Lua garbage collector is not designed to handle such big heaps efficiently anyway.
There are real applications where the GC limitations aren't a problem, but LuaJIT's addressing limitations are.
The one I've run into is using extremely large strings (hundreds of megabytes per string). I've encountered cases where LuaJIT runs out of memory, but standard Lua works fine (and is quite fast).
So while Mike Pall may be correct about the behavior of the GC, the implication that LuaJIT's addressing limitations therefore aren't a problem in practice is not correct.
Given LuaJIT's limitations (not just this, but also the more fundamental issue of portability), the de facto forking of the language is worrying...
Sure, it's limited to using a maximum of 2GB of memory if I recall correctly; as far as I'm aware it's mainly due to the 32-bit pointers that are used along with the garbage collector.
You really need more than 2G of heap? And can't use ffi? There were some posts to the mailing list that seemed to have a genuine issue, but most use cases do not.
2GB (for 64-bit apps) by itself can be mitigated by using ffi-allocated memory. The problem is really if you write a plugin (OSX) using luajit, and you can't change the main app to not use any memory below 4GB (where the heap should be).
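For reference, the usual mitigation is something like this minimal sketch: grab the big buffers from malloc through the FFI so they live outside the GC-managed low-address heap, and hang free() on them as a finalizer:

    local ffi = require("ffi")
    ffi.cdef[[
    void *malloc(size_t size);
    void free(void *ptr);
    ]]

    -- Allocate a large double array outside LuaJIT's GC heap; free() runs
    -- automatically when the cdata pointer is collected.
    local function big_double_array(n)
      local raw = ffi.C.malloc(n * ffi.sizeof("double"))
      assert(raw ~= nil, "malloc failed")
      return ffi.gc(ffi.cast("double *", raw), ffi.C.free)
    end

    local a = big_double_array(100 * 1024 * 1024)  -- ~800 MB, not counted against the GC limit
    a[0] = 1.5
    print(a[0])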
This may be a valid technical concern for certain products (plugins), but not a real limitation for others. And so far it's been mostly on OSX.