I can't seem to let go of this idea of using multiple env maps in layers on a single object. When I was testing that weird brick texture, I originally left the separated envmap channels colored while applying them, so I could see where one transitioned into the other. To see the transition I needed to switch to blending additive, and that's actually how I stumbled across that trick for later use. I let the three separated env maps be represented by cyan, magenta, and yellow, which made them very identifiable in game.
Well, if we're talking about something like a static placeable, you can have an unlimited number of environment maps. But it has to be static (and maybe classification tile?) in order to work. You're still "limited" to one envmap for each mesh in the model, because you can only have one texture per mesh and the envmaptexture is called in that texture's .TXI. I've done probably four or five envmaps on meshes within the same model that way.
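If it helps anyone trying this, the per-mesh setup is just one line in the texture's .TXI. Sketching it for a hypothetical two-mesh placeable textured with pillar_a.tga and pillar_b.tga (those names are made up), pillar_a.txi might hold

    envmaptexture CM_Baremetal

while pillar_b.txi names whatever other cube map you want, and each texture's alpha channel still controls where its envmap actually shows on that mesh.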
I've since forced my internet speed down when loading Drakensang so I can see how they place all the layers on top of each other to give the great appearance they have. I've also been looking at the spectacular mods for Star Wars Battlefront. There is a screenshot where Vader's helmet is modified by two envmaps. His outer helmet is set up with a standard envmap which shows some lit-room color, but mostly shows light on the right. The face mask's env map is set up with something like the red light of a photo-developing room, probably to make it look like he's in front of a control panel on his ship. Most of that red light shows on the left. This is something you can easily duplicate with multiple envmap layers and additive blending, and I'd like to play with that more.
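As a rough sketch of the layered setup I mean (the mesh and envmap names below are invented; only blending additive and envmaptexture are the real TXI bits): give the outer-helmet mesh a texture whose .TXI reads

    envmaptexture cm_litroom

and give the overlapping face-mask mesh a texture whose .TXI reads

    blending additive
    envmaptexture cm_redroom

Because the second layer is additive it can only brighten what's underneath, so the dark parts of its envmap vanish and the red highlights stack on top of the normal layer, which is what sells the two-light-sources look.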
One of the areas in Drakensang makes use of this a lot with crystal and some magical glows, and I'd really like to bring that over to NWN... just because we can.
I would love to see how some of this stuff plays out if you experiment!
General notes:
* The graphics pipeline changes for the following settings (at least):
- Enable Texture Animations
- Environment Mapping on Creatures
- Visual Effects High Enabled (this might just mess with emitter functions)
- Enable Shiny Water
The full domain of what some of the above do and how (exactly) they do it is a bit of a mystery to me, but I can see the difference in the individual frames I grab via GLIntercept. With them turned on, it's pretty common to have a base texture and envmap pair occupying TEX0 and then other maps, usually cube maps, hanging out in TEX1-4 or 5. It may be that some TXI commands which appear not to work actually do something, just not when some of the above settings are turned on.
* After reading this fascinating approach to hacking models into submission, I've started hex editing compiled models a bit. Because of the ambiguity in testing presented by the first bullet point about the graphics pipeline, flipping bits or changing pointers from one thing to another hasn't produced anything interesting yet. I have hex edited DDS files even less, however... the last bit of information stored in the Bioware header is mostly unknown, but someone once posted that they believe it is an alphamean setting. Though I don't know how this would differ from a .TXI-based alphamean change, if that's really what the setting is for, it could have some uses. If anyone is interested in hex-editing binary models to see what can be seen, I will produce a brief guide to get them started.
* Reminding myself again that transparencyhint goes up to 5 in the Bioware export scripts.
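For reference, in the ASCII model that's just a per-node line; the node and texture names below are made up and the geometry is left out:

    node trimesh glass_pane
      parent my_placeable
      bitmap glass01
      transparencyhint 5
    endnode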
Edit (11/30/2015):
* To understand this next bit, you have to know that compiling a model with a supermodel sometimes requires that supermodel to be available to the compiler. A perfect in-a-nutshell description comes from Brian Chung in "Change Phenotype Lag" in the Omnibus:
"BioWare uses a separate, external compiler for that, and yes, the source MDLs in the supermodel hierarchy need to be in the same folder during compile and play, as they're linked in some manner after being compiled."
If anyone is looking for more specifics, see 0x0068 in the model header. Think "this is probably about memory management and letting the engine know when, during gameplay, it can safely unload a model which is only used as a supermodel".
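For context, that link starts life as a single line in the ASCII source. A hypothetical phenotype model pointing at the stock a_ba animation supermodel (P_MyPheno here is an invented name) would start something like

    newmodel P_MyPheno
    setsupermodel P_MyPheno a_ba
    classification character

and it's that setsupermodel reference which the compiler, and per the quote above the engine, wants to be able to resolve.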
Years ago, I was looking at some raw compiled VFX_*.mdls and found them referencing other VFX files, but not files they were supermodeled into. I remember little except puzzling over the oddity of it.
Fast forward to this last week. After examining a simple animesh ASCII file, I noticed some unusual additional occurrences of "bitmap". You can probably see this by making a plane and animating a noise (?) modifier with a length of 90 frames and a sample period of 3. Anyway, being the imp I am, I changed each of the bitmap references to something different, compiled the model, and checked with a hex editor. The first of the three occurrences was not found in the compiled model, but the second was. The third happened to be a TGA (with a rather convoluted TXI) that was not referenced in the model at all; it merely happened to be in the directory with the ASCII model at the time.

I bring all this up because I am starting to suspect that what the .MDX is in Knights of the Old Republic is probably the "raw data" area of a compiled model, just split into a separate file. Why a file which was not specifically mentioned in an ASCII model (a texture, no less!) was referenced in a compiled model, in the correct place for a texture, is a mystery. Using Process Monitor to see what files the Bioware model compiler is looking for doesn't yield anything; it just gets a directory listing and then does (whatever) under the sheets. I guess what I'm thinking is there may be some "relationship" between a TGA (or a TGA with a TXI, or whatever) and a model which happens to be compiled with it. This is all with the internal Bioware model compiler, BTW.