I'm messing around with Lua trying to create my own "scripting language".
It's actually just a string that is translated to Lua code, then executed with loadstring. I'm having a problem with my string patterns. When declarations nest (for example, defining a variable inside of another variable declaration) it errors. For example, the following code would error:
local code = [[ define x as private: function() define y as private: 5; end; ]] -- defining y inside of another variable declaration causes the error
This is happening because the pattern to declare a variable first looks for the keyword 'define', and captures everything until a semicolon is found. Therefore, x would be defined as:
function() define y as private: 5 --found a semicolon, set x to capture
I guess my question is, is it possible to ignore semicolons until the correct one is reached? Here is my code so far:
local lang = {
    ["define(.-)as(.-):(.-);"] = function(m1, m2, m3)
        return (m2 == "private"
            and " local " .. m1 .. " = " .. m3 .. " "
            or  " " .. m1 .. " = " .. m3 .. " ")
    end,
}

function translate(code)
    for pattern, replace in pairs(lang) do
        code = code:gsub(pattern, replace)
    end
    return code
end

local code = [[ define y as private: function() define x as private: 10; end; ]]

-- remove the spaces from code, translate it to Lua code through the
-- 'translate' function, then execute it with loadstring
loadstring(translate(code:gsub("%s*", "")))()
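To make the failure concrete, here is a self-contained reproduction of the bad capture (the private/public branch is dropped for brevity): with the same pattern, the lazy capture (.-); stops at the first semicolon it finds, so the inner block is swallowed into the outer value.

```lua
-- Whitespace already stripped, as in the real code
local s = "defineyasprivate:function()definexasprivate:10;end;"

-- Same pattern as above; the replacement ignores the private/public
-- check since it is irrelevant to the capture problem
local out = s:gsub("define(.-)as(.-):(.-);", function(m1, m2, m3)
    return " local " .. m1 .. " = " .. m3 .. " "
end)

print(out)
--> local y = function()definexasprivate:10 end;
-- The inner 'define' was consumed as y's value, and a stray 'end;'
-- is left over, so loadstring fails on the result.
```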
You're absolutely crazy for trying to do this at all.
Try to have it capture only as far as the colon when a 'define' is caught, incrementing a 'depth' variable and parsing the remaining code. When it runs into another 'define' (or anything else that opens a code block ending with a semicolon), capture the "required" portion of that block (i.e. up to the colon of that 'define'), increment the 'depth' again, and keep parsing. Whenever it runs into a semicolon, decrement the depth value and return.
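The depth-counting idea can be sketched as a small scanner. This is only a sketch, not drop-in code: the helper name find_closing_semicolon is made up here, and it only handles 'define' as a block opener.

```lua
-- Given the whitespace-stripped source and a position just past the
-- colon of an outer 'define', return the position of the semicolon
-- that closes that outer block, skipping semicolons that belong to
-- nested 'define' blocks.
local function find_closing_semicolon(s, start)
    local depth = 1
    local i = start
    while i <= #s do
        -- plain find (no patterns), so 'define' is matched literally
        local d = s:find("define", i, true)
        local semi = s:find(";", i, true)
        if not semi then return nil end   -- unbalanced input
        if d and d < semi then
            depth = depth + 1             -- entered a nested block
            i = d + #"define"
        else
            depth = depth - 1             -- a block just closed
            if depth == 0 then return semi end
            i = semi + 1
        end
    end
    return nil
end

-- The outer define's colon is at position 17, so scanning starts at 18;
-- the semicolon after 'end' (position 51) closes the outer block, while
-- the one after '10' is correctly attributed to the nested define.
local s = "defineyasprivate:function()definexasprivate:10;end;"
print(find_closing_semicolon(s, 18))  --> 51
```

From there, everything between the colon and the returned position is the outer variable's value, and you can run the same translation on it recursively.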
Again, you're absolutely crazy for trying this.