
Lua Table Persistence · 2010-01-27 14:56 by Black in

Lua is a very flexible scripting language for embedding into programs. Its standard API is very slim and lacks all but the most basic functions. Adding them is easy, though.

The persistence code here requires nothing but Lua's standard io.open for reading and writing files. It can handle loops, multiple references to the same table in both keys and values, and most standard value types.
Not supported are userdata, threads and many types of functions. Exporting simple Lua functions works, but the exported byte code is not portable. The result of the export is itself Lua code; when executed, it returns data structures equivalent to those that were exported.

The core of the export is a simple recursion with a dispatcher method and a writer for each type. When an unsupported type is encountered, nil is written instead. This can cause problems on import when such unsupported values are used as table keys, but in most cases it is more desirable than failing the export.

persistence.lua [5.50 kB]

-- Format items for the purpose of restoring
writers = {
  ["nil"] = function (file, item)
      file:write("nil");
    end;
  ["number"] = function (file, item)
      file:write(tostring(item));
    end;
  ["string"] = function (file, item)
      file:write(string.format("%q", item));
    end;
  ["boolean"] = function (file, item)
      if item then
        file:write("true");
      else
        file:write("false");
      end
    end;
  ["table"] = function (file, item, level, objRefNames)
      local refIdx = objRefNames[item];
      if refIdx then
        -- Table with multiple references
        file:write("multiRefObjects["..refIdx.."]");
      else
        -- Single use table
        file:write("{\n");
        for k, v in pairs(item) do
          writeIndent(file, level+1);
          file:write("[");
          write(file, k, level+1, objRefNames);
          file:write("] = ");
          write(file, v, level+1, objRefNames);
          file:write(";\n");
        end
        writeIndent(file, level);
        file:write("}");
      end;
    end;
  ["function"] = function (file, item)
      -- Only works for "normal" functions, not those
      -- with upvalues or C functions
      local dInfo = debug.getinfo(item, "uS");
      if dInfo.nups > 0 then
        file:write("nil --[[functions with upvalue not supported]]");
      elseif dInfo.what ~= "Lua" then
        file:write("nil --[[non-lua function not supported]]");
      else
        local r, s = pcall(string.dump, item);
        if r then
          file:write(string.format("loadstring(%q)", s));
        else
          file:write("nil --[[function could not be dumped]]");
        end
      end
    end;
  ["thread"] = function (file, item)
      file:write("nil --[[thread]]\n");
    end;
  ["userdata"] = function (file, item)
      file:write("nil --[[userdata]]\n");
    end;
}
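The dispatcher `write` and the helper `writeIndent` are called by the writers above but are not shown in this excerpt. A minimal sketch of what they might look like, with signatures taken from the calls in the listing (the indentation character is a guess):

```lua
-- Dispatch on the runtime type of the item; each writer emits
-- the Lua source that recreates the value.
function write (file, item, level, objRefNames)
  writers[type(item)](file, item, level, objRefNames);
end

-- Indent nested table constructors for readability of the output.
function writeIndent (file, level)
  for i = 1, level do
    file:write("\t");
  end
end
```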

To be able to export tables that are referenced several times (be it a cycle in the data structure, or just a table that is inserted several times), the structures to be written are examined first, and the number of references to each table is counted.
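The counting helper (called `refCount` in the store function below) is not shown in the excerpts. A minimal sketch, assuming it recurses over both keys and values and only descends into a table the first time it is seen, which keeps cycles finite:

```lua
-- Hypothetical sketch of refCount: counts how often each table occurs
-- anywhere in the data. Tables seen more than once become multiRefObjects.
function refCount (objRefCount, item)
  if type(item) == "table" then
    objRefCount[item] = (objRefCount[item] or 0) + 1;
    -- Recurse only on first sight, so cycles do not loop forever
    if objRefCount[item] == 1 then
      for k, v in pairs(item) do
        refCount(objRefCount, k);
        refCount(objRefCount, v);
      end
    end
  end
end
```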

All tables that have multiple references to them are created at the start of the export file, before they are filled with content. This is required since they could contain themselves or other multi-ref tables.

After all those temporary tables are created, they are filled with content. The writer for tables uses a lookup table for multi-ref tables: instead of emitting a table constructor for them, it references the table created at the start. Last but not least, the passed arguments themselves are written out in the same way.

persistence.lua [5.50 kB]

  store = function (path, ...)
    local file, e;
    if type(path) == "string" then
      -- Path, open a file
      file, e = io.open(path, "w");
      if not file then
        return error(e);
      end
    else
      -- Just treat it as file
      file = path;
    end
    local n = select("#", ...);
    -- Count references
    local objRefCount = {}; -- Stores references that will be exported
    for i = 1, n do
      refCount(objRefCount, (select(i,...)));
    end;
    -- Export objects with more than one reference and assign names
    -- First, create empty tables for each
    local objRefNames = {};
    local objRefIdx = 0;
    file:write("-- Persistent Data\n");
    file:write("local multiRefObjects = {\n");
    for obj, count in pairs(objRefCount) do
      if count > 1 then
        objRefIdx = objRefIdx + 1;
        objRefNames[obj] = objRefIdx;
        file:write("{};"); -- table objRefIdx
      end;
    end;
    file:write("\n} -- multiRefObjects\n");
    -- Then fill them (this requires all empty multiRefObjects to exist)
    for obj, idx in pairs(objRefNames) do
      for k, v in pairs(obj) do
        file:write("multiRefObjects["..idx.."][");
        write(file, k, 0, objRefNames);
        file:write("] = ");
        write(file, v, 0, objRefNames);
        file:write(";\n");
      end;
    end;
    -- Create the remaining objects
    for i = 1, n do
      file:write("local ".."obj"..i.." = ");
      write(file, (select(i,...)), 0, objRefNames);
      file:write("\n");
    end
    -- Return them
    if n > 0 then
      file:write("return obj1");
      for i = 2, n do
        file:write(" ,obj"..i);
      end;
      file:write("\n");
    else
      file:write("return\n");
    end;
    file:close();
  end;
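As an illustration of what store emits, exporting a small self-referencing table such as `local t = {}; t.self = t` produces a file roughly like the following (whitespace simplified):

```lua
-- Persistent Data
local multiRefObjects = {
{};
} -- multiRefObjects
multiRefObjects[1]["self"] = multiRefObjects[1];
local obj1 = multiRefObjects[1]
return obj1
```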

Loading the exported data is simple; the provided method merely adds some error checking.

persistence.lua [5.50 kB]

  load = function (path)
    local f, e = loadfile(path);
    if f then
      return f();
    else
      return nil, e;
    end;
  end;
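Assuming the functions shown here are collected in a module table named `persistence` (as the file name suggests; this naming is an assumption), a round trip looks like this:

```lua
-- Store two values, then restore them from the generated file.
-- Assumes store and load live in a table called `persistence`.
local t = { x = 1, tag = "demo" };
t.self = t; -- a cycle, handled via multiRefObjects

persistence.store("data.lua", t, "extra");
local t2, extra = persistence.load("data.lua");

assert(t2.x == 1 and t2.tag == "demo");
assert(t2.self == t2); -- the cycle survives the round trip
assert(extra == "extra");
```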

I hope this code is useful to someone. Use it as you wish; it is licensed under the MIT license.


Source Code Management with Git · 2009-12-29 15:22 by Black in

Git is a distributed SCM designed by Linus Torvalds to manage the development of the Linux Kernel. Since it’s licensed under the GPL, it can be used freely by anyone.

Just like backups are a necessity for anyone who uses a computer (or should be…), source code management is a necessity for serious developers. Not only does it track the past state of the project (which allows tracing the introduction of bugs), but it also allows the management of separate branches. That way, development can continue to add new experimental features while production uses only stable and tested code.

Git is a distributed SCM tool; unlike CVS and Subversion, it does not require a central server, and by design there is no central authoritative repository. Every repository contains the full history. Every file is hashed and added to a database. Every commit contains a tree of file hashes, a commit message and a pointer to the ancestor commits. All of that is hashed and added to the database as well, so a commit's hash can be used to cryptographically verify the integrity of the complete previous history. For a more technical perspective on git's inner workings, read Git for Computer Scientists (it really is quite cool in its simplicity). Here's a one-sided comparison of git with some alternatives.

I started using git at the beginning of 2008 for my work on ExaminationRoom, and while the start was a bit hairy, having a history of my code development as well as my comments has helped me a lot, even as the only developer. I worked on three computers, so keeping the code synchronized was critical. That too was easy thanks to the SCM, even without a reachable central server (one of the computers had no internet access; it was only used to drive two projectors for the experiments).

I still use git these days and can't recommend it enough, although most of my other projects are World of Warcraft addons… All my public code can be cloned from my repositories.


Screen Space transformation in OpenGL · 2009-12-13 16:10 by Black in

Transforming from world space to screen space in OpenGL is useful for selecting and picking with a mouse or a similar device. OpenGL's utility library provides the functions gluProject and gluUnProject for this purpose. This class replicates that functionality. Download the full header and source for a commented version. Below, a simplified header shows the API.


class ScreenProject
{
public: // ScreenSpace
  void calculateMVP(GLint * vp, double * mv, double * p);
  void calculateMVP();
  Point transformToScreenSpace(Point _p, float _f = 1) const;
  Point transformToClipSpace(Point _p, float _f = 1) const;
  Point transformToWorldSpace(Point _p) const;
private:
  double mvp_[16];    /**< Product of Modelview and Projection Matrix */
  double mvpInv_[16]; /**< Inverse of mvp_ */
  long vp_[4];        /**< Viewport */
};

Even after the class is instantiated, it cannot be used until one of the calculateMVP overloads is called, either to pass custom matrices or to read the matrices from the current OpenGL context. Once that is done, the matrix and its inverse are stored internally, and the calculations work without accessing any outside data.

The various transform functions transform the passed point with the internal state. Going from screen space to world space is very useful in picking, selecting or generally interacting with the scene with a pointing device. Transforming to clip or screen space is used when calculating anchor points for labels on screen.

The Point type is a three-element vector. OpenGL works with homogeneous coordinates, so a fourth value is needed. Because this is almost always 1, it was chosen as the default value. There is one important exception: when transforming normals, the last coordinate is zero, since normals are not influenced by translations.

The source file contains code for the transformation with the matrices, the viewport transformation and matrix inversion. The code for the inversion was taken from Mesa; everything else is mine. OpenGL's matrices are column-major, so the four numbers of the first column map to the first four slots of the 16-slot array. The transformation matrix is built from the modelview matrix M and the projection matrix P as P*M; a point p is then projected as P*M*p.

screenproject.cpp [6.51 kB]

Point ScreenProject::transformToClipSpace(Point _p, float _f) const
{
  Point pT = transformPointWithMatrix(_p, mvp_, _f);
  return Point((pT[0] + 1)/2, (pT[1] + 1)/2, (pT[2] + 1)/2);
}

Point ScreenProject::transformToWorldSpace(Point _p) const
{
  // Transform to normalized coordinates
  _p[0] = (_p[0] - vp_[0]) * 2 / vp_[2] - 1.0f;
  _p[1] = (_p[1] - vp_[1]) * 2 / vp_[3] - 1.0f;
  _p[2] = 2 * _p[2] - 1.0;

  // Transform
  return transformPointWithMatrix(_p, mvpInv_);
}

Point ScreenProject::transformPointWithMatrix(Point _p, const double * _m, float _f) const
{
  float xp = _m[0] * _p[0] + _m[4] * _p[1] + _m[8] * _p[2] + _f * _m[12];
  float yp = _m[1] * _p[0] + _m[5] * _p[1] + _m[9] * _p[2] + _f * _m[13];
  float zp = _m[2] * _p[0] + _m[6] * _p[1] + _m[10] * _p[2] + _f * _m[14];
  float tp = _m[3] * _p[0] + _m[7] * _p[1] + _m[11] * _p[2] + _f * _m[15];
  if (tp == 0)
    return Point(xp, yp, zp);
  else
    return Point(xp / tp, yp / tp, zp / tp);
}


Simple vector class in C++ · 2009-12-12 14:44 by Black in

Mathematical vectors are often used in C++, but no classes for them exist in std, the standard library. C++'s std::vector is a heavyweight container that offers a rich function set but is unsuitable for mathematics.

For ExaminationRoom I created a set of small classes to be able to easily pass vectors around in my code and perform simple operations on them. The implementation makes use of templates and operator overloading to offer an easy interface for users and still keep it flexible. It is not intended as a competitor to Boost’s uBLAS classes or similar rich mathematics libraries.

vec.h [7.47 kB]

/**
A small helper object, that is a 2 element vector. It can be treated as a point
(with x, y accessors) or an array (with operator[] accessor).
*/
template <typename T>
union Vec2
{
  enum { dim = 2 };

  struct
  {
    T x;
    T y;
  };

  T vec[dim];

  Vec2() { x = y = 0; };
  Vec2(const Vec2<T>& v)
  {
    x = v.x;
    y = v.y;
  };
  Vec2(T a, T b)
  {
    x = a;
    y = b;
  };

The code above is the start of the declaration of the Vec2 type. There are three base types, Vec2, Vec3 and Vec4, which are intentionally incompatible; conversion via a constructor is only possible when no data is lost. An interesting detail is that the class is declared not as a normal class but as a union: all members share the same location in memory. That way, vector values can be accessed either by name or by index.

vec.h [7.47 kB]

template <typename T>
inline Vec2<T> & operator/=(Vec2<T> &v1, const T s1)
{
  v1.x /= s1;
  v1.y /= s1;
  return v1;
}

template <typename T>
inline const Vec2<T> operator+(const Vec2<T> &v1, const Vec2<T> &v2)
{
  Vec2<T> v = v1;
  return v += v2;
}

The operators are defined globally as inline functions since they are rather simple. Whenever possible, the definition of an operator builds on a previously defined one to minimize code duplication.
Some special methods were defined for normalization of vectors, conversion to homogeneous vectors, and cross products of 3-element vectors. Often-used types are given usable names. In ExaminationRoom, these were the types I used:

vec.h [7.47 kB]

typedef Vec3<float> Vec3f;
typedef Vec4<float> Vec4f;
typedef Vec3f Point;
typedef Vec3f Vector;
typedef Vec3f Color3;
typedef Vec4f Color4;

Lua Interface

An implementation of marshaling to Lua, in the form of a simple table, was also written with luabridge:

lua interfacing

template <typename V>
inline void pushVector(lua_State *L, V v)
{
  const int n = V::dim;
  lua_createtable(L, n, 0);
  for (int i = 0; i < n; i++)
  {
    lua_pushnumber(L, i+1);
    lua_pushnumber(L, v[i]);
    lua_settable(L, -3);
  }
}

template <typename V>
inline V toVector(lua_State *L, int idx)
{
  const int n = V::dim;
  V v;
  luaL_checktype(L, idx, LUA_TTABLE);
  for (int i = 0; i < n; i++)
  {
    lua_pushnumber(L, i+1);
    lua_gettable(L, idx);
    v[i] = lua_tonumber(L, -1);
    lua_pop(L, 1);
  }
  return v;
}

template <>
struct tdstack <Tool::Vec2f>
{
  static void push (lua_State *L, const Tool::Vec2f & data)
  {
    pushVector<Tool::Vec2f>(L, data);
  }
  static Tool::Vec2f get (lua_State *L, int index)
  {
    return toVector<Tool::Vec2f>(L, index);
  }
};

From C++ to Lua, a table is created and all elements are put into it in order. This requires the element type of the vector to be compatible with Lua's number representation, which is usually a floating point number. The reverse direction converts tables containing numbers back into a vector type of the suitable size. Missing or wrong table contents lead to Lua errors that are caught with lua_pcall; any other table contents are ignored. (Here the reason for the existence of the enum "dim" becomes apparent: it is a compile-time constant that, unlike a static const member variable, takes up no storage and can therefore be both declared and defined in a header.)
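On the Lua side, a marshaled vector is simply an array-style table with 1-based indices. A hypothetical script working with a pushed Vec2f could look like this:

```lua
-- What pushVector builds for a hypothetical Vec2f(1.5, 2.0):
local v = { 1.5, 2.0 }
-- Components are read by index; here, the vector's length:
local len = math.sqrt(v[1]*v[1] + v[2]*v[2]) -- 2.5
-- A plain table of numbers is all toVector needs for the way back:
local unit = { v[1]/len, v[2]/len }
```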


Mayan with GLSL · 2009-12-09 00:00 by Black in

The first test implementation of Mayan was a Photoshop file containing the picture in various states of desaturation and blending. The second implementation was a DirectShow filter for the group's stereo movie player. The third and latest implementation is an OpenGL Shading Language shader for ExaminationRoom.

ExaminationRoom was extended to support shader-assisted merging of the two viewpoints. This was done by rendering both the left and the right camera's view to framebuffer objects, which are then drawn while the given fragment shader is active. The shader can calculate how to modify each side's fragments. The blend func is GL_ONE during this time, so no further modification is performed.

mayan.fs [526.00 B]

uniform sampler2D tex;
uniform float side;

// Factor that determines how much of the other
// colors are mixed into the primary channel of that
// side. This is the same lambda as in the mayan paper.
uniform float lambda;

void main()
{
  float facR = 1.0-side;
  float facG = side;
  float mixFactor = (1.0-lambda)*0.5;

  vec4 c = texture2D(tex, gl_TexCoord[0].xy);
  gl_FragColor = vec4(
    facR*(c.r*lambda + (c.g+c.b)*mixFactor), // Red
    facG*(c.g*lambda + (c.r+c.b)*mixFactor), // Green
    c.b*0.5, // Blue
    0.5); // Alpha
}

Fragment shaders get a uniform variable that defines which side the currently drawn texture is on. Lambda is a factor that influences the desaturation of the colors for better 3D impression.

Using shaders for mixing allows for maximal adaptability with hardware accelerated speed. Unlike the original Anaglyph renderer it can mix different colors and is able to handle shared channels like Mayan’s blue.


ExaminationRoom · 2009-12-08 13:05 by Black in

As previously mentioned, ExaminationRoom is the result of my master's thesis. From the project page:

Viewing stereoscopic movies or images is unnatural. The focus and vergence of the eyes have to be decoupled. Artefacts and inconsistencies of a stereoscopic image with the real world cause confusion and decrease viewing pleasure.

ExaminationRoom is a tool that enables exploring those problems and quantifying them by providing a flexible and extensible framework for user testing. Challenges include understanding the needs of this relatively new field of research, as well as the methods commonly used in user testing.

The project began with a simple program that generated random dot stereograms from a 1 bit depth image. While this code was rewritten completely later on, it still proved that the general idea of the test worked.

The real ExaminationRoom design started out as a bunch of boxes on a notepad. It was fairly simple: a scene graph containing the objects that are displayed, a scripting core that executes user-provided code to move the scene, and a rendering engine that renders the scene graph.

After some searching I decided to build my own scene code; the preexisting libraries had too many limitations when it came to simulating depth cues. The script core was a Lua state that acted directly on the scene graph. During my WoW addon writing career I came to like this language; its simplicity makes it easy to learn and to integrate into other applications.

The whole application had to run on both Mac OS X and Windows. Qt was the most comfortable way to achieve this goal; it abstracts many platform-dependent features such as window and input handling. Easy handling of pictures for textures was an added benefit.

The rendering of the scene is OpenGL based. Each object in the scene graph can draw itself into the scene, containers can modify the state before and after drawing their contents to achieve interesting effects. The rendering of the stereoscopic representation is controlled by a group of classes titled Renderer, which are responsible for mixing the views of left and right cameras appropriately.

ExaminationRoom Screenshot

The screenshot shows a simple scene with custom depth ordering drawn with the line interlacing renderer.
Read more on this topic in my thesis, but be warned: It’s long! :)