OTR was just showing how filerange works over in his cube mapping discussion. Instead of supplying a single file, you supply several numbered files and tell it the range; it then collects that many files, starting with the txi's base filename plus 0 and ending at the given count minus one. So filerange 6 listed in a txi file looks for <filename>0.tga/dds through <filename>5.tga/dds. Used with cube mapping, it appears to queue up the six images needed to make the panels of a cube.
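Just to keep the naming straight in my head, here's a quick sketch of that lookup in plain Python (the base name is made up, and the behavior is only as I understand it from OTR's thread):

```python
def filerange_files(base_name, file_range, ext="tga"):
    """List the textures a 'filerange N' line should pull in,
    per the behavior described above: <base>0 through <base>N-1."""
    return [f"{base_name}{i}.{ext}" for i in range(file_range)]

# filerange 6 on a hypothetical cube map texture
print(filerange_files("CM_mycube", 6))
# ['CM_mycube0.tga', 'CM_mycube1.tga', ... 'CM_mycube5.tga']
```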
I was thinking that if there were any chance filerange could be linked to other txi commands, I might be able to make flowing water without having to put all the images in a single file. Or you could make a sign with letters that change in order.
Or not in order! I was hoping all the emitter functions might be available inside the txi system. The two seem alike in many other respects, such as blending modes and the ability to display tiled images or sprite sheets.
However, unlike txi, emitter graphics are at least displayed in the correct order. The trick to using sprite sheets as animated textures via txi is to keep the sheet only a single row high. Additional rows are read in a strange order: the first row, then the last row, then all the rows in between, starting from the second-to-last and working upward. Emitter animated textures read rows from the top down and left to right, as expected.
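Here is that difference spelled out as a couple of throwaway Python functions, just so the row order is unambiguous (this is only what I observed, not anything documented):

```python
def txi_row_order(num_rows):
    """Row read order I see with multi-row txi sprite sheets:
    first row, then last row, then the in-between rows bottom-up."""
    if num_rows <= 1:
        return list(range(num_rows))
    return [0] + list(range(num_rows - 1, 0, -1))

def emitter_row_order(num_rows):
    """Emitters read rows top to bottom, as you'd expect."""
    return list(range(num_rows))

print(txi_row_order(4))      # [0, 3, 2, 1]
print(emitter_row_order(4))  # [0, 1, 2, 3]
```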
One of the main reasons I asked for this knowledge this morning was that I was displaying 80-, 90-, and 250-frame sprite animations. If I were to import Diablo 2 animations, many of those would be 114 frames. That odd count makes boxing them into a multi-row file difficult, since texture dimensions have to adhere to a strict set of sizes: 2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, and up.
Even if I align all the sprites in a single row, I still have to adhere to those numbers in both dimensions. Say I have 128-pixel images and I line 80 of them up: that gives a pixel width of 10240, which is not one of those powers of two. Padding the canvas out to a legal width comes with the definite issue of black space at the end of the strip.
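To put numbers on it, a quick back-of-the-envelope check (plain Python, nothing engine-specific):

```python
frame_px = 128                                   # width of each sprite frame
frames = 80                                      # frames laid out in a single row

strip_width = frame_px * frames                  # 10240
is_pow2 = (strip_width & (strip_width - 1)) == 0
print(strip_width, is_pow2)                      # 10240 False

# nearest legal texture widths (powers of two) on either side
lower = 1 << (strip_width.bit_length() - 1)      # 8192
upper = 1 << strip_width.bit_length()            # 16384
print(lower, upper)
```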
Two workarounds for this issue exist, but they have issues of their own. In either case, you resize the image itself, not the canvas, to one of those power-of-two widths. Say I stretch 10240 up to 16384: that makes each sprite slightly wider, but if I tell the txi that my count of frames per row (numx) is 80, it will divide 16384 by 80 to determine where to cut out each frame. Likewise, I could shrink the width down to 8192, losing a little detail; again, numx set to 80 tells the engine where to cut out graphics for display.
Both of these workarounds share a simple flaw: 16384/80 is 204.8 and 8192/80 is 102.4, both fractions. The engine does not handle fractions well; in fact, it seems to drop the fractional part entirely. But it is worse than that...
This morning I created an 80-frame strip, as mentioned above. When I ran it in game, every frame was cut slightly offset from the one before it. After 80 frames, or 80 * 0.8 pixels of accumulated error, you are 64 pixels off from the intended cut point. So the engine not only drops the fraction, it also stores a single truncated number for what it thinks the offset is and reuses it for every cut, which loses a lot of texture by the end of a strip that wide. 64 pixels is half a frame in my 128-pixel-frame scenario.
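Here's a little simulation of what I think is going on, assuming the engine truncates the per-frame step to a whole pixel and then reuses that one number for every cut (that part is my guess, based on the drift I saw):

```python
width, numx = 16384, 80            # stretched strip width and frames per row

exact_step = width / numx          # 204.8 px per frame, what we actually want
engine_step = int(exact_step)      # 204 px, if the fraction is simply dropped

for frame in (1, 20, 40, 79):
    drift = frame * (exact_step - engine_step)   # accumulated error in pixels
    print(frame, round(drift, 1))
# 1 0.8 ... 79 63.2 -- roughly half of a 128 px frame by the end of the loop
```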
Since adjacent frames are not meant to be seamless, a visible line develops and shifts across the texture. After the 80th frame plays, it jerks back to cutting from 0,0, with no offset and no line, which makes the loop point very easy to spot and things generally ugly. At 32 frames per second that loop comes around fairly fast. At 8 fps the animation is sluggish, but at least it takes a long time to loop and jerk.
In conclusion: if I could simply specify the start frame and stop frame, or a firstframe/lastframe combination the way emitters can, I could avoid all of that mess.