LuaSTG Pattern Making Tool-Columns


In previous tutorials we discussed a few programming challenges relevant to making patterns. We introduced a simple model in which a bullet is spawned by a single function call whose parameters can be tweaked to spawn multiple bullets in patterns, and we also introduced parametrization and tasks. For the rest of Part I of this tutorial, I will introduce some concepts that are specific to the algorithm this tool uses to generate patterns.


The implementation of almost all patterns involves the use of loops. Although writing loops in a programming language is flexible and basic (which is good), we aim for better. When we script patterns with loops (such as the loop examples in previous tutorials), there is repetitive work that can potentially be automated. In many cases, before the loop executes we have to back up the parameters we are going to modify, define a task and add delays, and write increment statements for the parameters. For example:

a loop with one parameter incremented
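
To make the repetition concrete, here is a minimal sketch of such a loop. It is an illustration rather than the original listing: it assumes the SpawnBullet(image, x, y, a, v) helper and the task.New / task.Wait functions from the previous tutorials, with a placeholder image img and placeholder numbers.

    -- a sketch of a loop with one parameter (the angle) incremented by hand
    task.New(self, function()                        -- task defined manually
        local a = self.rot                           -- back up the parameter we are about to modify
        for i = 1, 8 do                              -- repeat count written out manually
            SpawnBullet(img, self.x, self.y, a, 2)
            a = a + 10                               -- increment statement written out manually
            task.Wait(6)                             -- delay between repetitions
        end
    end)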

If the loop nesting goes deep, the code becomes increasingly hard to understand, which makes editing the script less efficient.


So we want to automate some of the repetitive parts. The first step is to parametrize literally everything in a loop. Here we consider a nested loop with delays, parametrized as shown in the previous tutorial. Since we are working with a nested loop with delays, the parameters fall into two groups:

  1. parameters representing initial variables
  2. parameters representing increment values, number of times loop repeats, delays


Breaking the first category down further, we get a modified version of the groupings:

  1. initial values of the incrementable variables
  2. (initial values of) the non-incrementable variables (since they often don't change, it's not necessary to call them "initial")
  3. a reference to the master object
  4. increment values, number of times loop repeats, delays


For a specific set of spawn parameters (like the parameter list image, x, y, a, v we have been using), the number of non-incrementable variables (1, for image alone) determines the number of variables in 2) in the list above, and the number of incrementable variables (4, for x, y, a and v) determines the number of initial values in 1) and increment values in 4). The remaining variables do not depend on the spawn parameters and must always be included.


We implement this grouping in code with tables. For each variable in the list above, we add a key-value pair <variable_name, variable_value> to the table it belongs to. We will be using two types of tables, which I will call output columns and parameter columns; each column is a table containing the variables that belong to it. In the parameter categorization above, 1), 2) and 3) belong to all output columns, and 1), 3) and 4) belong to all parameter columns. A few other variables are also included, and they will be discussed below.
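
In Lua, such a table is an ordinary key-value table. A rough sketch, using the field names defined in the sections below and placeholder values:

    local column = {
        s_image  = img,     -- a non-incrementable variable
        x = 0, y = 0,       -- initial values of incrementable variables
        a = 90, v = 2,
        s_master = self,    -- reference to the master object
    }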


Output Column

An output column represents the complete set of spawn parameters that is needed to spawn a bullet. It is a table with the following fields.

Fields Description
s_t (float) the initial timer value; once this value becomes greater than or equal to 0, the spark function will be called on the column itself.
s_master (game object) an object that can hold tasks.
spark (function) a function that spawns a bullet according to the column's own fields.
s_image some non-incrementable variables.
x, y, a, v (number) the initial values of the incrementable variables.
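
For instance, an output column for our bullet example might be filled in as follows; the concrete values and the image img are placeholders, and the spark function is assigned in the next code block:

    local out_col = {
        s_t      = 0,       -- timer value; spark may fire once this reaches 0
        s_master = self,    -- an object that can hold tasks
        spark    = nil,     -- the spawning function, assigned in the next code block
        s_image  = img,     -- non-incrementable variable
        x = 0, y = 0, a = 90, v = 2,   -- initial values of incrementable variables
    }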


The spark function is a function that spawns a bullet (or any output object). An example would be a piece of code similar to what was covered in our first tutorial:

pseudo code of spawning a bullet with an output column
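
A minimal sketch of such a spark function, again assuming the SpawnBullet(image, x, y, a, v) helper from the first tutorial:

    -- spark for an output column: read the spawn parameters off the column
    -- itself and spawn a single bullet
    out_col.spark = function(col)
        SpawnBullet(col.s_image, col.x, col.y, col.a, col.v)
    end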


People unfamiliar with function pointers/references may be confused about why a function can be stored in a variable like spark. If so, try looking up "passing functions as arguments" on Google for a programming language you know well.


The objects spawned by the spark function do not have to be bullets. Enemies or other objects can be spawned as well with different sets of spawn parameters.
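
For example, a column whose spark spawns an enemy could carry a different set of fields; SpawnEnemy and its parameter list here are hypothetical stand-ins:

    local enemy_col = {
        s_t = 0, s_master = self,
        s_image = enemy_img,                      -- placeholder enemy image
        x = 0, y = 120, hp = 10,
        spark = function(col)
            SpawnEnemy(col.s_image, col.x, col.y, col.hp)   -- hypothetical spawner
        end,
    }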


Parameter Column

A parameter column represents a set of parameters that completely parametrizes a loop; in other words, it emulates a loop execution. It is a table with the following fields.

Fields Description
s_n (integer) the number of times the loop repeats; if this is 0, the loop will not execute at all.
s_script (array) an ordered sequence of scripts to be executed on each repetition of the loop; the scripts should be functions of the form f(self, next, i).
s_next (array) stores references to one or more output columns and parameter columns.
s_t (float) the initial timer value; once this value becomes greater than or equal to 0, the spark function will be called on the column itself.
s_dt (float) the waiting time between consecutive loop executions.
s_master (game object) an object that can hold tasks.
spark (function) a function that uses a loop to call the next columns' spark functions s_n times.
s_image some non-incrementable variables.
x, y, a, v (number) the initial values of the incrementable variables.
d_x, d_y, d_a, d_v (number) the increments applied to the spawn parameters on each repetition of the loop.

The variable s_script is perhaps the most crucial element of this table, and I will cover it very soon in the next tutorials. For now, though, you can ignore this variable and every line of code that references it.


A natural issue that arises when we divide the parameters into groups is that no single group can generate the whole pattern on its own. We need to link the groups together in some way, in this case by using a field that stores links to other columns (which can be other parameter columns or output columns). This field is s_next. By looking at the first parameter column and all the other columns it links to, we have all the parameters we need to generate the entire pattern.


The s_next field of a parameter column will point to other parameter columns or output columns. It is defined as an array to allow more flexible linking between the columns, but for simplicity I will treat it as a single reference to one column in this tutorial. We will use the term "next column" of a column object C to refer to the column object C.s_next.
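
Continuing the sketch from the output column section, a parameter column linked to out_col could be set up as follows; the concrete values are placeholders, and spark is assigned after the parameter-column spark function is defined below:

    local par_col = {
        s_n      = 8,         -- number of loop repetitions
        s_script = {},        -- ignored for now, as noted above
        s_next   = out_col,   -- the "next column" (a single reference here)
        s_t      = 0,
        s_dt     = 6,         -- wait between repetitions
        s_master = self,
        spark    = nil,       -- assigned once the parameter-column spark is defined below
        s_image  = img,
        x = 0, y = 0, a = 90, v = 2,            -- initial values
        d_x = 0, d_y = 0, d_a = 10, d_v = 0,    -- per-repetition increments
    }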


The difference between the spark function of an output column and that of a parameter column is that in the former, a spark call simply spawns a bullet, while in the latter, a spark call triggers the execution of a loop that calls the next column's spark function many times. Unlike an output column, we never change the implementation of a parameter column's spark function, whether we are spawning bullets or not. It follows this procedure:

  1. if this is called on a column that has not made a backup of itself, make backups of itself and every column in its s_next field recursively; update the s_next fields of itself and every column reachable from its s_next field.
  2. create a task on s_master that executes the following piece of code.
  3. wait -s_t time
  4. start a loop with i = 0; at each loop, make a copy of s_next with the increments applied
  5. execute s_script in order; for each script in s_script, the parameters (column, s_next, i) are passed to the script
  6. execute s_next.spark, passing itself (s_next) as parameter to the spark function
  7. increment i by 1
  8. wait s_dt time; repeat the loop if i is less than or equal to s_n


Next I will show a piece of pseudo code of the actual implementation with simplifying modifications.

pseudo code for triggering the for loop
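
Since the original listing is not reproduced here, the following is only a sketch of what this spark function could look like under the simplifications used in this tutorial: s_next is treated as a single column, the backup step and s_script are skipped, the loop runs s_n times as described in the field table, and task.New / task.Wait are the task helpers from the earlier tutorials.

    local function ParameterSpark(current)
        task.New(current.s_master, function()
            task.Wait(-current.s_t)            -- step 3: wait -s_t time
            for i = 0, current.s_n - 1 do      -- step 4: loop starting at i = 0, run s_n times
                -- make a copy of the next column and apply the increments with
                -- multiplication, so 'current' itself is never modified
                local next_copy = {}
                for k, v in pairs(current.s_next) do next_copy[k] = v end
                next_copy.x = current.x + i * current.d_x
                next_copy.y = current.y + i * current.d_y
                next_copy.a = current.a + i * current.d_a
                next_copy.v = current.v + i * current.d_v
                -- step 5 (running s_script) is omitted in this simplified sketch
                next_copy.spark(next_copy)     -- step 6: fire the next column
                task.Wait(current.s_dt)        -- step 8: wait before the next repetition
            end
        end)
    end
    par_col.spark = ParameterSpark             -- hook it onto the parameter column above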


The above function essentially parametrizes a loop using parameters from the table current (and next), meaning that now whenever we need a loop to generate bullets, we can instead assign spawn parameters to a parameter column and call spark on it.


We create a copy of the next column object on each loop repetition, and the parameters of the copy are modified. The modified values are calculated from the increments and the current control variable i. Instead of adding the increment on each repetition, as in var = var + incVar, we use multiplication to obtain the result, as in var = baseVar + i * incVar. This avoids modifying the content of the current column: the value of baseVar is unchanged, so the values on the current column are not altered by executing the spark function.


To better understand this process, think of the output column as the simplest pattern, one that shoots a single bullet. The ith parameter column out of all n columns sees the (i + 1)th through nth columns as a "subpattern": on each repetition it shoots such a subpattern and increments some of the parameters before shooting another one on the next repetition. This continues up to the 1st column, which shoots the entire pattern in one call.


The order of the lines of code inside the loop also guarantees that the column scripts are called after the variables on next_copy are incremented.


We need to be aware of some new problems that come with using these column objects. Column objects need to be created and initialized, the value of each initial parameter has to be set, and the links from one column to another need to be set up. All of this has to be done manually, and dealing with references to many columns is often very error-prone. If you want to rename a column object variable from col1 to col2, you have to change every definition, assignment and linking statement that references it. This results in a lot of repeated work, even more so than the basic approach of writing loops. Worse, it is not immediately obvious that all patterns can be represented by some combination of the column objects we have defined, so why would we use such limiting constructs?


There are solutions to these issues, and I will talk about them in the following chapters. The spoiler is that we can automate this process again. The column objects introduced in this tutorial will not be used directly as an end-user interface; rather, they will serve as building blocks for making patterns, with the definitions, assignments and linking managed automatically.


We have discussed two types of column objects: output columns and parameter columns. Output columns can replace the SpawnBullet function, and parameter columns can be used in place of loops. With these objects defined, we can rewrite all the for-loop examples from previous tutorials as sequences of column objects ending in an output column, and calling the spark function on the first column object in the sequence generates a pattern equivalent to executing the original for loops.
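
As a rough usage sketch reusing the columns defined above, a two-level nested loop with delays becomes a chain of two parameter columns ending in the output column, and a single spark call on the outermost column fires the whole pattern:

    local outer_col = {
        s_n = 3, s_script = {}, s_next = par_col,
        s_t = 0, s_dt = 60, s_master = self,
        spark = ParameterSpark,
        s_image = img, x = 0, y = 0, a = 90, v = 2,
        d_x = 0, d_y = 0, d_a = 5, d_v = 0.5,
    }
    outer_col.spark(outer_col)   -- fires the whole two-level pattern in one call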