Author Topic: Forum, Downloads, and Info: CEP & Project Q  (Read 2218 times)

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #45 on: March 08, 2013, 01:42:24 am »


               It's probably not very reasonable for me to write about such a complex topic this late in the day (err, night, yes?), so you'll have to forgive any oversights on my part >.<

Anyway, here goes: first, we would need to agree that we're talking about a "public" release of a merger. While it's perfectly possible to merge both packages in a way that fits the situational needs of a PW or a module builder, by customizing it and making compromises, a "public" release entails that each and every *unique* resource of both packages remains available through the merger. That's how I see it, at least.

Off the top of my head, I can think of two types of resources that are extremely problematic: clothes (or rather bodyparts, to be accurate) and tilesets. Bodyparts don't come with 2DAs that allow us to shift the models around as we want. Clothes, robes, helms, cloaks, even heads, wings and tails would need to be renumbered. While renumbering would be possible for some of these resources, others are limited to 255 entries (clothes and heads, for example).

Next, tilesets, which are even more problematic. CEP has one version of the "same" tileset (tin01, for example), Project Q another. Will such tilesets get duplicated, which wouldn't have much value to a builder and might sooner or later cause issues with the maximum number of tilesets available in a module, or will they be merged too?

Even creatures will prove problematic, since the supermodelled ones can share animations, some phenotypes will come with robes and others won't, etc.

Finally, what about duplicates? Let's ignore for a moment the difficulty of identifying true duplicates (I could elaborate on that if needed, but at a later time). Let's imagine the CEP has one version of the Balor, and Project Q another. Which version should we pick? If we pick both versions, we might face content bloat again, which isn't helpful at all to a builder (at least, it isn't to me).

So we might decide that one package systematically overrides the same versions of the other package. However, what if we prefer a version from the first package for one model, and a version from the second package for another model ?

I'm not saying a "public" merger without any heavy compromises is totally impossible (even if my little nose, err, builder's intuition, tells me that it comes very close to it). What I'm saying is that it's not a worthwhile endeavor given:

1/ how much time is required as initial investment
2/ that all the builders will never agree with each other on the compromises that need to be made
3/ that no one will agree to reliably maintain such a merger over time, making it effectively pointless to builders with each new release of the two packages.

There you have it. I don't mean to discourage anyone from undertaking such a challenge, please prove me wrong by all means ^.^ I also certainly do NOT want to discourage anyone from using either of these fantastic packages, or both of them at the same time by tweaking them themselves, but I think we should all keep in mind whether it's really worth it in the end to have it ALL.

We have a local saying here where I live: sometimes, "better" is the enemy of "good".
               
               

               


                     Modified by Nissa_Red, 08 March 2013 - 01:56.
                     
                  


            

Legacy_painofdungeoneternal

  • Sr. Member
  • ****
  • Posts: 313
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #46 on: March 08, 2013, 03:20:50 am »


               Sounds to me like a good test. Your description is quite helpful - very similar to the issues I am dealing with in NWN2. And yes, I understand it's skipping a lot of details; it's nice to have an idea of what I am getting into from someone with experience.

I don't think it's possible to do it perfectly. The goal would be to make it so a deranged person who decides to use everything would be able to use it for his PW or other project, and really stress the issues in my system to the point where it's easy for me to spot them. I really doubt it would ever assemble the same way twice; each mashup would need a custom set of instructions on how to deal with the required choices, and anything built with it would need to recreate it as it was.

My preference, though, is that such a tool would not be for merging Q and CEP; rather, it could be used so that bits of Q, bits of CEP, and a lot of other bits can all be downloaded and merged dynamically in more granular pieces to create custom packages. Eventually you'd select the individual creatures, clothes, tilesets, scripts, whatever you want to use, and the compilation would be built as needed. Ideally compilations would be less sets of content and more instructions as to what content is needed, plus any post-processing and injection required.

Perhaps make it so certain things are reserved/standardized, so a desert using Q or a desert using CEP would tend to use the same slot/row, a balor tends to use a certain row in appearance.2da, and other similar things tend to be marked as interchangeable versions of the same thing. This is a future idea, but think of it this way: if you use little content, and most people have a demon on row 9999, then as you add lots of content or swap areas with others, it's less likely to be a problem if I can make demons magnetically attracted to a given row unless it's already in use.
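A quick sketch of what this "magnetic" row idea might look like in code. This is purely illustrative: the reserved-row numbers are invented, not any actual community standard, and a real tool would also rewrite the 2DA and any blueprints that reference the row.

```python
# Sketch of "magnetic" row assignment for appearance.2da (illustrative only;
# the reserved-row numbers here are invented, not a real community standard).
def assign_row(preferred_row, used_rows):
    """Return preferred_row if free, else the first free row after it."""
    row = preferred_row
    while row in used_rows:
        row += 1
    return row

# Example: a balor "prefers" row 9999; if a module already uses 9999,
# it slides to the next free row instead of colliding.
used = {9999, 10000}
print(assign_row(9999, used))  # 10001
print(assign_row(5000, used))  # 5000
```

The point is that content only moves off its preferred row when it has to, so two modules built independently from the same pool still tend to agree on row numbers.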
               
               

               
            

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #47 on: March 09, 2013, 02:06:57 am »


               

painofdungeoneternal wrote...
My preference, though, is that such a tool would not be for merging Q and CEP; rather, it could be used so that bits of Q, bits of CEP, and a lot of other bits can all be downloaded and merged dynamically in more granular pieces to create custom packages. Eventually you'd select the individual creatures, clothes, tilesets, scripts, whatever you want to use, and the compilation would be built as needed. Ideally compilations would be less sets of content and more instructions as to what content is needed, plus any post-processing and injection required.

Perhaps make it so certain things are reserved/standardized, so a desert using Q or a desert using CEP would tend to use the same slot/row, a balor tends to use a certain row in appearance.2da, and other similar things tend to be marked as interchangeable versions of the same thing. This is a future idea, but think of it this way: if you use little content, and most people have a demon on row 9999, then as you add lots of content or swap areas with others, it's less likely to be a problem if I can make demons magnetically attracted to a given row unless it's already in use.


You appear to have a very reasonable take on the question.

As a builder, and if I correctly interpreted what you said, your tool would have immense value to someone who tries to patch together different hakpacks on a "project" level rather than a "public" level. While I am not eager to support any kind of "mega-ultra" merger project, like those we've seen in the past, which were all doomed to fail eventually, I'm eager to provide you with whatever personal experience I have acquired during my own toolset "adventures", however modest it might be, if you find it useful.

At its core, I consider an NWN resource to be "an appearance", or rather "an appearance + associated functionalities", because sometimes stuff like WOKmeshes or animations is not immediately visible. As I see it, the issue with building from a huge pool of NWN resources is not only managing to make them all work together, but also identifying and cataloging them (to avoid content "bloat"). That has always been THE true issue to me, and I'll spare you my lamenting over finding yet another instance of Lisa's clothes in yet another clothing hakpack, for example ^.^ Since Lisa's cloth models were mostly near perfect to start with (in the way of a toolset resource, anyway), no one ever retouched them. Therefore, there is no added value in finding them in separate hakpacks over and over again.

It would be a huge time saver to me if some kind of tool could somehow identify these models when inspecting/importing a hakpack, and associate them with a catalogued base model, if relevant. But how could we uniquely identify models from a common pool, with plenty of potential duplicates?

Well, unfortunately, I haven't found any easy way. The model name, size, time of creation or MD5/SHA hashes are not reliable enough. The way I still do it is through visual inspection of the "appearance", and then of the model file, to detect additional "functionalities", like updates, corrections, or added value (animations, phenotypes, icons, etc.).
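To illustrate one reason raw hashes fall short: ASCII .mdl files are plain text, so a repack that only changes comments or whitespace produces a different MD5/SHA even though the model is identical. A sketch of a slightly more forgiving fingerprint, under the assumption that `#` starts a comment in ASCII MDL; a real tool would also need to handle binary-compiled models and renamed node/texture references, which this does not:

```python
import hashlib
import re

def normalized_mdl_hash(text):
    """Hash an ASCII .mdl after stripping comments and collapsing whitespace.

    Illustrative sketch only: assumes '#' begins a comment, and ignores the
    harder cases (binary models, renamed nodes, re-exported geometry).
    """
    lines = []
    for line in text.splitlines():
        line = line.split('#', 1)[0]               # drop trailing comments
        line = re.sub(r'\s+', ' ', line).strip()   # collapse whitespace
        if line:
            lines.append(line)
    return hashlib.sha256('\n'.join(lines).encode()).hexdigest()

a = "node trimesh chest  # original\n  verts 3\n"
b = "node trimesh chest\nverts 3"
# Raw hashes differ, normalized hashes agree:
print(hashlib.sha256(a.encode()).hexdigest() ==
      hashlib.sha256(b.encode()).hexdigest())      # False
print(normalized_mdl_hash(a) == normalized_mdl_hash(b))  # True
```

Even this only catches textual near-duplicates; visually identical models exported from different tools would still need the eyeball pass described above.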

So what I'd consider the next best option would be to be able to import hakpacks as a whole, visually inspect models (with an interface like NWN Explorer), and catalog models once and for all in a local database with a proper key once I've identified them as unique, so that extra documentation could be added. A good part of this extra documentation could be automatically generated from the 2DA or directly from the model/texture files. If it can't be generated (the setting or style, for example: oriental, modern, medieval, etc.), it can be added by the user. It would also help pinpoint models with issues. For example:

  • this bodypart model has a PLT texture with C1/C2/L2/M1 channels
  • that creature comes with a reflection map
  • that tile lacks the proper lighting nodes
  • that placeable lacks the proper use nodes

I should then be able to export the model (and any associated textures or models, like phenotypes, reflection maps, doors or tileset groups) "on the fly" with whatever toolset identifiers ("001", "002", "tcn01", "race", etc.) fitting my needs, along with the properly constructed ".2DA"/".set" files. There is my hakpack.

The next time I find interesting resources on the Vault, or elsewhere, I will import them, compare them to existing/catalogued resources, and eliminate duplicates. Since I would only ever be keeping unique models, new stuff would play nicely with the old from the get-go. It would just take some effort to get it all catalogued at the beginning.

Basically, it would be a "Set editor" (updated here), expanded to any kind of NWN resource, but with a database.

Why a database? Because builders operate with entities at the "object" level, rarely if ever at the sub-level. If I want to merge clothes into a hakpack, it should come with everything needed to find "clothes" in the toolset, without the risk of forgetting anything or of human error (for example, "pmh0_chest38.mdl", an existing BioWare model, which should have been "pmh0_chest038.mdl"). The same goes for tilesets, creatures (which have footstep sounds, sizes, races, etc.) and placeables (which have soundsets, portraits, etc.).
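The "pmh0_chest38" example is exactly the kind of slip a tool can catch mechanically. A small sketch of a zero-padding check for body-part model names; the regex here is my own guess at the naming pattern (p + gender + size letter + phenotype digit + part + number) and is hypothetical, not an official BioWare spec:

```python
import re

# Hypothetical helper: normalize body-part model names so the numeric
# suffix is always three digits (e.g. "pmh0_chest38" -> "pmh0_chest038").
# The pattern is an illustrative guess, not an official naming spec.
def normalize_part_name(name):
    m = re.fullmatch(r'(p[mf][a-z]\d_[a-z]+)(\d{1,3})', name)
    if not m:
        return name  # not a body-part name we recognize; leave untouched
    stem, num = m.groups()
    return f'{stem}{int(num):03d}'

print(normalize_part_name('pmh0_chest38'))   # pmh0_chest038
print(normalize_part_name('pmh0_chest038'))  # pmh0_chest038 (already padded)
print(normalize_part_name('tin01_a01_01'))   # tin01_a01_01 (not a body part)
```

Run over a whole hakpack at import time, a check like this would flag mis-padded files before they ever reach the toolset.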

Why a local database? I've thought of an online database, but there is the lingering question of who will maintain it, pay the upkeep costs, etc. If some common repository could be agreed upon, however, the database could be fed and monitored through community effort, and the advantages would be immediate:

  • like you suggested, CC artists could reserve "lines" (that models are "magnetically" lured to) and would get proper credit as the original source when their model is used somewhere
  • their models could be fixed or enhanced over time and ALL builders would benefit from them
  • since almost any resource would be uniquely identified (except recently released stuff), the risk of duplicates would be significantly lowered

Working hakpacks could be generated "on the fly" from anywhere, for almost any kind of project (PW/SP modules). Local databases would be a fallback solution, or perhaps the first step towards a more centralized option.

I think this post is probably long enough as it is, though, and I'm perhaps too optimistic and overlooking stuff, so before I get further carried away, I'll stop for now ^.^ Thank you for undertaking this project. If it takes off (and I really hope it will), you'll have solved about 95% of the issues the community has ever had with custom content.

Sending you my encouragements, and kind regards.
               
               

               


                     Modified by Nissa_Red, 09 March 2013 - 02:11.
                     
                  


            

Legacy_painofdungeoneternal

  • Sr. Member
  • ****
  • Posts: 313
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #48 on: March 09, 2013, 04:52:50 am »


               I see the duplicate issue as key to what I am doing. CRCs and the like are the first line of defense, but frankly my aim is just to inventory all the current content, and set it up so it can be annotated by anyone, using an open API (which can be accessed by websites and applications). What you are describing will not be done up front, but rather crowdsourced on an ongoing basis. However, getting the use nodes, level of detail in the models, and related 2DA information up front will be part of the initial processing.

It needs manual control to work well, and making this easy is key. But all the work is going to be labeling the data, with preview images eventually added, and the data fields being changed to support the latest ideas of the community. How the data is used should, I think, be open to the end users; hence the data won't be in a database, but in an open web-service-type API. The data is going to correlate to the Vault and the Nexus, and be maintained as these update so it's always in sync. If you have a file on your system with a CRC of such-and-such, my API could identify all the original sources for this content. Likewise, it creates a system to allow the community to access overall data on what is in the Vault, what is duplicated, what is unique, and further to add information, commentary and annotations to that data.
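The CRC-to-sources lookup described above can be sketched in a few lines. The index here is a plain local dict standing in for the proposed web API, and the package names are invented for illustration:

```python
import zlib

def crc32_of(data: bytes) -> str:
    """Hex CRC-32, the kind of fingerprint the proposed API would key on."""
    return format(zlib.crc32(data) & 0xFFFFFFFF, '08x')

# Stand-in for the proposed open API: a local index mapping CRCs to the
# packages that originally shipped that file (entries here are invented).
index = {
    crc32_of(b'balor model bytes'): ['CEP 2.4', 'Project Q 1.5'],
}

def lookup_sources(data: bytes):
    """Return every known original source for a file's exact bytes."""
    return index.get(crc32_of(data), [])

print(lookup_sources(b'balor model bytes'))  # ['CEP 2.4', 'Project Q 1.5']
print(lookup_sources(b'unknown file'))       # []
```

In the real design the dict would be replaced by an HTTP query against the hosted API, but the contract is the same: bytes in, list of attributed sources out.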

In addition, this is in association with the Vault Preservation Project, which has much of the Vault content safe on hard drives, but does not have it usable. Rolo Kipp invested in 2 years of hosting (and he really could use contributions and support; most of us just don't have a lot of cash to get this going). The idea here is that we cannot just reupload the Vault willy-nilly, but if the Vault dies and we have a backup, we can throw up one with the same content, run by some folks associated with the Vault (hence sidestepping the lack of permission). I am not backing up the Vault, but inventorying the content of the Vault, with names, dates, CRCs, and even preview images, including information I am adding on where this content should be installed to, and I plan for this to support the VPP.

My primary purpose at first is the installation of packages, either as a whole or in part. Installation will be driven by manifest files, which describe the original content but also allow dependencies, which will download from the Vault, the Nexus and private sites. The player hits play on a module; I see the CRC of the module and the module's name, can use this to look up the data related to this module in the API, and thus get its dependencies, and know it needs certain override content to run right, and where that override content needs to be downloaded from.
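A minimal sketch of that manifest-driven dependency resolution. The manifest shape and package names are invented for illustration; the actual manifest language described later in the thread also supports conditions and custom commands, which this does not attempt:

```python
# Hypothetical manifest format (invented for illustration): each entry names
# a package and the dependencies that must be fetched before it installs.
manifests = {
    'my_module':     {'requires': ['q_tileset_hak', 'cep_core']},
    'q_tileset_hak': {'requires': []},
    'cep_core':      {'requires': []},
}

def install_order(name, manifests, seen=None):
    """Depth-first resolution: dependencies first, each package once."""
    if seen is None:
        seen = []
    for dep in manifests[name]['requires']:
        if dep not in seen:
            install_order(dep, manifests, seen)
    if name not in seen:
        seen.append(name)
    return seen

print(install_order('my_module', manifests))
# ['q_tileset_hak', 'cep_core', 'my_module']
```

The installer would walk this order, downloading each package from its recorded source before installing the module itself.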

But the way I am doing this is designed to allow others access to the same API (similar to how people use the Facebook API to make games, or the Google Maps API to show location information, or Skywing's API to make new ways of hopping into NWN). Others will come up with things I have not thought of, such as marking content which relates to other similar content, or which is part of a given set, or tools which upload/download content more easily, or which manipulate the data in new ways.

What you are describing can easily be done with my API - probably by me, or by anyone else who has a hankering and can't wait until I get to it. Since I already note the CRC of unique content, and the dates and names, some of the work will just be data processing. Despite it being a different flow, it seems entirely possible to do what you describe with the raw data, and really this seems to me to be annotating what is out there.

Further, I see manifests being used to patch the content of others, thus allowing community fixes to old content (or even official content, such as adding things to the OC modules). The old legacy content is going to take a long time to sort out, but upgrades and new content can be tracked as they're done, especially if my app is used to sync versions on an author's system with the Vault and the Nexus. A user can just request all the content that fixes a given package (which is marked at a given quality level), the query can be done in the API, and the end user gets the latest and greatest.

Similarly, if I see a new version of a piece of content, I keep the old and new CRCs for the pieces, thus being able to track the old version, and use matching names to know a piece is an improved version. This should allow me to correlate the various CRCs and names, and identify that wherever that older CRC is present, it likely can be replaced with the new version. Again this is largely just data processing; once the raw data is stored, this is just legwork.
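That old-CRC-to-new-CRC bookkeeping amounts to chasing a chain of supersession records. A sketch, with made-up CRC values:

```python
# Sketch of version tracking by CRC: when a new upload replaces an old file,
# record old_crc -> new_crc, then chase the chain to find the latest version.
# (The CRC values here are made up for illustration.)
superseded = {
    'aaaa1111': 'bbbb2222',   # v1 -> v2
    'bbbb2222': 'cccc3333',   # v2 -> v3
}

def latest_version(crc, superseded):
    """Follow supersession links until we reach a CRC nothing replaces."""
    while crc in superseded:
        crc = superseded[crc]
    return crc

print(latest_version('aaaa1111', superseded))  # cccc3333
print(latest_version('cccc3333', superseded))  # cccc3333 (already current)
```

Any file found on a user's disk whose CRC resolves to a newer one through this chain is a candidate for the "upgrade available" prompt described above.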

The key, I think, to doing what you describe is to have a table of packages, which might be on either the Nexus or the Vault, or both.

Then have a table of files as seen in the file system - haks, ERFs and other loose files. These are the containers.

Deeper in, have a table of loose files, but also the contents of ERFs, haks and such - these are the actual files. An individual 2DA row almost belongs at this level as well.

Finally, to come back to what you brought up, have a separate table listing items, placeables, clothes, tilesets - i.e. the smallest pieces people actually deal with. That is, a model comprising its animations, its textures, 2DA adjustments, etc., and a picture so it's easy to identify. But further, have these list what are variations on the same piece. And give certain people authority in the community to help organize these: specify the rows and indexes this content prefers to use, which is the most ideal version of the specific content, whether it carries Greek or other tags, and who the actual author is. (Further, when installed, the ability to easily renumber, but make it so it tends to use certain rows to increase compatibility.)
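The four layers just described could be sketched as SQLite tables. All table and column names below are invented for illustration; the source URL is a placeholder:

```python
import sqlite3

# Minimal sketch of the four layers described above, as SQLite tables:
# packages -> containers (haks/erfs/loose) -> files inside them -> pieces.
con = sqlite3.connect(':memory:')
con.executescript("""
CREATE TABLE packages  (id INTEGER PRIMARY KEY, name TEXT, source_url TEXT);
CREATE TABLE containers(id INTEGER PRIMARY KEY,
                        package_id INTEGER REFERENCES packages(id),
                        filename TEXT, kind TEXT, crc TEXT);  -- hak/erf/loose
CREATE TABLE files     (id INTEGER PRIMARY KEY,
                        container_id INTEGER REFERENCES containers(id),
                        resref TEXT, restype TEXT, crc TEXT);
CREATE TABLE pieces    (id INTEGER PRIMARY KEY, name TEXT, author TEXT,
                        preferred_row INTEGER,
                        variant_of INTEGER REFERENCES pieces(id));
""")
con.execute("INSERT INTO packages(name, source_url) VALUES (?, ?)",
            ('Project Q 1.5', 'https://example.invalid/q'))
print(con.execute("SELECT name FROM packages").fetchone()[0])  # Project Q 1.5
```

`pieces.variant_of` carries the "variations on the same piece" idea, and `preferred_row` the magnetic-row idea, as self-referencing and plain columns respectively.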

Reserving rows is a big issue, and after the data previously described is setup, well this likely will build on that.

(The TLK file issues I think I have a much better handle on, and I hope to have a system which indexes all the used TLK entries on the Vault, and the text involved, as well as the various translations and official TLK entries. My hope here is to crowdsource translations and get it so even minor content gets a translation - perhaps even get the game translated into entirely new languages - but I'm kind of leery of getting into this at the moment, as it's going to be hard to set up right until quite a few other things are completed.)

Again, thanks for your input; I'm going to be reviewing your comments more in depth as I develop this. I'm still going to focus on the CEP and Project Q, using the system I described, and seeing how much I can get both of them working together (pulling this off-topic detour back into its original topic). This is going to be how some people use my tool initially. (I need to see how Q does its rsync as well, just so I use the latest versions of their content; I've never tried recreating rsyncing yet.)

( done editing finally )
               
               

               


                     Modified by painofdungeoneternal, 09 March 2013 - 08:52.
                     
                  


            

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #49 on: March 09, 2013, 06:42:02 am »


               I will let you finish before answering.

Mostly, however, I agree with you 99%, and am very enthusiastic about your project/goals.

Please take your time, still sipping morning coffee here.
               
               

               
            

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #50 on: March 09, 2013, 10:33:46 am »


               First, thank you for your valuable feedback, and time. I think I understand better now what your goal and priorities are.

In essence, your tool is the manager of a huge ("virtual"? "distributed"? I want to avoid using too much lingo) hakpack, without the limitations of a hakpack.

1- It would take as input:
a/ a collection of hakpacks, downloaded from the Vault or elsewhere, like today
b/ a list of "identifiers" associated with unique content, regularly updated online as new content is released
c/ module/user-dependent data (customized 2DAs, renaming or renumbering schemes, etc.)

2- It would deliver as output:
a/ one or several hakpacks

For players, everything is transparent (and simply downloaded online through your tool). Builders will have to intervene at 1-c/, to provide the data (manifests) for hakpacks and modules that already exist, or that will be available tomorrow, and record them in a community-moderated online "database" (like a wiki).

It comes as a batch/API, that can be expanded upon.

Hopefully, I got that right. Please don't hesitate to correct me if I'm wrong.

Of course, despite some slight remaining skepticism about how you intend to proceed at point 1-b/ to obtain the identifiers in a foolproof way (I have ideas to submit about alternatives or complements if you're interested, but that'll come in another message, to keep this one short), I believe you know what you're doing, and I can see the immediate value of such a tool for our community as a whole.

Of course, you saw right through my (with some hindsight, naively stated) "expectations" in my previous message, and it's all to your credit that you tried to get me back on a more reasonable track in a very diplomatic fashion ^.^

I would, indeed, love to convince you (or maybe someone else, but one has to take the chance when the opportunity arises, yes?) to add to your tool:

3- It would allow the user to add personal documentation to identifiers of 1-b/

The point of my suggestion would not be to duplicate downloaded content in a local database. The database (SQLite should suffice) would only ever contain documentation that will always remain subjective, like categorization, or that is simply not worth centralizing, like annotations or links to local documents (.pdf, .doc, .xls). The purpose of such an evolution would be to let builders construct the manifests needed at point 1-c/ much more easily than any current tool allows. This in turn would allow us to feed your tool with them, and obtain customized hakpacks within minutes.
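The "documentation only" local database could be as small as a single table keyed by the shared identifier from 1-b/. A sketch using Python's built-in sqlite3; the identifier format, column names, and sample data are all invented for illustration:

```python
import sqlite3

# Sketch of the "local documentation only" idea: the database stores nothing
# but subjective annotations, keyed by the central identifier from 1-b/.
# (Identifier format and sample values are invented.)
db = sqlite3.connect(':memory:')
db.execute("""CREATE TABLE annotations (
    content_id TEXT PRIMARY KEY,   -- identifier from the shared catalog
    category   TEXT,               -- e.g. 'oriental', 'modern', 'medieval'
    note       TEXT,               -- free-form builder notes
    local_doc  TEXT                -- path to a local .pdf/.doc/.xls, if any
)""")
db.execute("INSERT INTO annotations VALUES (?, ?, ?, ?)",
           ('lisa_cloth_001', 'medieval', 'near perfect, never retouched', None))
row = db.execute("SELECT category, note FROM annotations WHERE content_id=?",
                 ('lisa_cloth_001',)).fetchone()
print(row)  # ('medieval', 'near perfect, never retouched')
```

Since the content itself stays in the hakpacks and the identifiers stay central, the local file holds only what is subjective or private, exactly as suggested above.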

It's my feeling that we're basically swimming in custom content and can't take full advantage of it, because we can't properly categorize, catalog and ultimately manipulate it. It's a pretty frustrating feeling, hence my plea, but I'm the first to realize the burden you've put on your own shoulders with such a project, and I would hate to compromise it with something that does nothing to alleviate it >.<
               
               

               


                     Modified by Nissa_Red, 09 March 2013 - 10:36.
                     
                  


            

Legacy_Rolo Kipp

  • Hero Member
  • *****
  • Posts: 4349
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #51 on: March 09, 2013, 01:40:13 pm »


               <listening...>

@ Nissa: Just a quick note, the synergy between so many projects under active development excites the blazes out of me :-)

One of the initiatives I've started (that is waiting breathlessly for the Never Launcher API) is the Custom Content Conservancy (which is one of several projects under the VPP).

The intention of the CCConservancy is to catalog everything with proper attribution and screenshots - and (if I can interest someone who has already demonstrated a java3d model viewer into figuring out a web embedded version... ;-) even a model previewer. It will recursively spider through all the stored content, find identical files and attempt to determine who the initial author was. This would be the root of a resource tree. Improvements would be versions up the trunk, variations would be branches (and might, in Launcher, be offered as choices to players, depending on builder settings, I'd imagine).

Eventually, I want every texture, every model, every sound indexed, cataloged, commented, and available, complete with listings of dependencies (these files use this model, this model uses these files, this model can be found in these archives).

This dialogue is simply wonderful, and I really hope you continue to give input into all these eggs I'm brooding over =)

<...ovidly>
               
               

               
            

Legacy_AndarianTD

  • Hero Member
  • *****
  • Posts: 725
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #52 on: March 09, 2013, 02:34:34 pm »


               

HeavenStar wrote...

Is it possible to build a mod with both CEP 2.4 and Project Q 1.5? I saw in Project Q document that they may not be compatible. Is there a way to work around that?


Not only is it possible, it's been done. See my recently released remake of Sanctum of the Archmage 1: The Sight, which integrates CEP 2.4, Q 1.5 and other haks with some module specific content as well.

While I haven't pulled out the haks as a separate release to the community, anyone who wants to develop a CEP/Q integrated module is welcome to use my merge 2DAs. Just download the main Sanctum archive and pull out the Sanctum_QCEP_top.hak. That should give you most of what you need.

(A side warning: There's no easy way I know of to handle the conflicting CEP/Q blueprints as part of the hak system without painstakingly re-making them and adding them to it. The Sanctum/Q/CEP merge is designed for Q content to take precedence over CEP content in those cases, and I resolve conflicts as I need them for my work in the module file. You may have to do something similar as well.)

Andarian
               
               

               


                     Modified by AndarianTD, 09 March 2013 - 02:37.
                     
                  


            

Legacy_painofdungeoneternal

  • Sr. Member
  • ****
  • Posts: 313
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #53 on: March 09, 2013, 02:39:21 pm »


               

Nissa_Red wrote...

I like that description. What I am doing is very hard to get across: though it copies many existing things, it's actually a completely new animal, which includes features that already exist but are designed to work together. (And the way you described it seems apt, and yet beyond how I've tried thinking of it so far.)

Nissa_Red wrote...

3- It would allow the user to add personal documentation to identifiers of 1-b/

The point of my suggestion would not be to duplicate downloaded content in a local database. The database (SQLite should suffice) would only ever contain documentation that will always remain subjective, like categorization, or that is simply not worth being centralized, like annotations or links to local documents (.pdf, .doc, .xls). The purpose of such an evolution would be to enable builders to easily construct the manifests needed at point 1-c/, much more easily than any current tool allows. This in turn would allow us to feed your tool with it, and obtain customized hakpacks within minutes.



The term is open API. Let's say I refuse to listen to you, and that you have the chops to do it yourself, or just get frustrated (which is why I ended up doing this; I'm probably not the best person to do it, but no one else listens).

You can make a program which matches up what I've done and its unique identifiers, and uses those as keys in your local program. Your local program keeps notes and additional information as desired, and you can replicate my installer if you so wish. (How it works will be documented, and the file formats are all documented and available as source code classes for various languages.)

This idea is the same as what I am doing on the Vault: since I am indexing the Vault content and adding it to my API, I can correlate the IP of a PW on the Vault with Skywing's GameSpy API (past or present IP, or perhaps name) and allow worlds to be listed by Vault rating, thus giving players a means of choosing a PW besides player count.

Of course I do listen, and plan on adjusting; I just wanted to underline what open means, and that when I am done it won't just be up to me.

(Manifests would be constructed not manually, but based on what is installed or in use; NWN2, for example, builds an XML file describing the download requirements inside the module each time you save it, so simple things would be auto-created. I will have manifests, and data inside the API, entirely done via batch processing initially, but the language I am using allows for conditions and custom commands inside these manifests - i.e. instead of setting a row on a 2DA, the language lets you append a row, and I cannot automate this easily (for some feats being in use by a class).)

The result would be similar to MacPorts in how it works, but by indexing things and knowing the folder they were in, and the file type, it's possible to automate it.
               
               

               


                     Modified by painofdungeoneternal, 09 March 2013 - 02:49.
                     
                  


            

Legacy_OldTimeRadio

  • Hero Member
  • *****
  • Posts: 2307
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #54 on: March 09, 2013, 02:53:17 pm »


               

Rolo Kipp wrote...
...and (if I can interest someone who has already demonstrated a java3d model viewer into figuring out a web embedded version... ;-) even a model previewer.

See, this is a great idea, but... well, I tried this sort of thing out two different ways in the past and IMO it's one of those "better on paper than in practice" ideas. I came at it from both VRML and Acrobat 3D, FWIW. What happens in both cases is you wind up going through a lot of bandwidth, in the form of streamed data or file size, just to duplicate content (models, textures) that you're not really getting all that much of a payoff for duplicating.

IMO, as pedestrian as it sounds, a good 3/4 isometric screenshot made with something like NWN Explorer Reborn with the appropriate "Initial viewing angle YPR" settings (in options) is going to net you something almost as good as a full 3D model with just a fraction of the bandwidth/processing power. And that work can also be crowdsourced much more easily, to boot.
               
               

               
            

Legacy_Tarot Redhand

  • Hero Member
  • *****
  • Posts: 4165
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #55 on: March 09, 2013, 03:56:31 pm »


               Building on what OTR just said, take a number of shots of the same object in sequence at different angles and combine them into an animated GIF for the best of both worlds.

TR
               
               

               
            

Legacy_henesua

  • Hero Member
  • *****
  • Posts: 6519
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #56 on: March 09, 2013, 04:40:03 pm »


               I agree with the screenshot approach.

I could write a web-based NWN Explorer for a database of resources, and I think it is possible to do so without running into serious bandwidth problems, BUT I've got other fish to fry, so I'm definitely not signing up to write such a thing at this time.

Why not just start with posting screenshots, and if I or anyone else ever gets around to writing a web-based NWN Explorer, you can plug that aspect in?
               
               

               
            

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #57 on: March 10, 2013, 02:17:01 am »


               

Rolo Kipp wrote...
@ Nissa: Just a quick note, the synergy between so many projects under active development excites the blazes out of me :-)


You and Painofdungeoneternal (and everyone else working on the project that I'm not aware of too, of course) have my gratitude for taking such a project at heart. I'm really happy you do.

I'll admit I didn't know about CCConservancy till very recently. I thought that, barring a complete copy-over to the Nexus or maybe a static "emergency" backup of the Vault stored somewhere, we'd never have as favorable a situation as the one we're currently experiencing in matters of custom content available for one of my fav' games, NWN. Now, thanks to Painofdungeoneternal's thorough explanations, I have a better and more optimistic perception of how things could evolve, and I am really starting to like the "CCCs" : they tend to bring us good things around these parts ^.^

As a module builder, these are my main preoccupations :

- making the best out of what we've been so generously provided with by people who shared the fruit of their talent and work with us, preserving it (not at the level CCConservancy may, of course, just for myself or friends I play with), and further enhancing it where possible (through fixes, but also through more pertinent use in modules)
- being able to reliably identify content and know where it comes from, not only for the sake of integrity (avoiding duplicates, or benefitting from enhancements the community might already have made), but also to acknowledge the original authors, so that I can show them the respect and gratitude they deserve

For me, these concerns go hand in hand. Achieving the first isn't possible without the second, and working on the second would have no point without the first in sight. So, to do this, my current workflow looks something like this :

1/ I download content from the Vault (or elsewhere), and archive it on my hard drive in a way that I can trace back to it easily (URLs for example)
2/ I gather similar content by type and then categorize it through Excel/OO sheets
3/ I inspect the content more closely, either through NWNexplorer or MDLviewer, and document it
4/ I attempt to identify potential duplicates or synergies
5/ I take advantage of this time to also gauge the quality of the content : are any fixes needed, or further customizations desirable/possible ?
6/ I then decide which content I intend to keep, and what I need or want to remove
7/ I use a batch to extract/reorder the content that I've previously identified
8/ I construct the 2DA/Set files needed for the extracted content to play well together
9/ I generate the final hakpack(s) out of 7/ and 8/

At this point, I should finally be able to start building, with near optimal knowledge of the content in the hakpacks of my module, of its sources (so that I can give proper credits), of its weaknesses (what kind of fixes I still need to make, or to be on the lookout for), and of its qualities and potential (how I could further customize it to bring my "personal" touch).
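Steps 1/ and 2/ could be supported by something as simple as a CSV manifest that ties every archived download back to its source URL and a category. A minimal sketch; the field names, function names and example categories are my own invention, not part of any existing tool:

```python
import csv

# Hypothetical manifest columns: where each archive came from and how
# it was categorized, so content stays traceable for credits and
# duplicate checks.
FIELDS = ["archive", "source_url", "author", "category", "notes"]

def add_entry(manifest_path, entry):
    """Append one download record; write the header if the file is new."""
    try:
        new_file = open(manifest_path).read() == ""
    except FileNotFoundError:
        new_file = True
    with open(manifest_path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(entry)

def entries_by_category(manifest_path, category):
    """Step 2/: pull everything filed under one category."""
    with open(manifest_path, newline="") as f:
        return [row for row in csv.DictReader(f) if row["category"] == category]
```

Nothing here that Excel/OO sheets can't do, of course; the point is only that a plain-text manifest is something a merging tool could read and write automatically.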

All of this I can already do as of today, but since I've read Painofdungeoneternal's messages, I wonder if I couldn't do it better, meaning faster and especially more reliably.

5/ and 6/ will never turn into automatic processes. They're my "added value" as a builder.

1/ as automatic process is of no direct value to me as builder (but would be as player!).

7/, 8/ and 9/ are already more or less automatic for me, and while I don't doubt that they could be further improved, I believe the tool will handle these parts adequately, without having to conform to what I perceive as personal preferences on my part.

2/, 3/ and 4/ are *the* processes that can be painfully slow, and they really bring my building to a halt sometimes. This is where third-party assistance would be desirable to me.

4/ is already within the scope of the tool, so that leaves us with just 2/ and 3/.

So what is it that the tool could do for me as a builder ? Without further petitioning for my "cause", as I've already said more than my piece about it (and Painofdungeoneternal has been remarkably patient about enduring it), I would just like to add the following :

2/ It's good that the tool will come as open (API) and documented. I don't know if I'll be able to take on the challenge to expand upon it, but I really see a huge potential there for any builder, and I hope it will be taken full advantage of (like the aforementioned "Set Editor").

Being able to properly assess what is available, to annotate it, and to sort and filter through it in order to make the best possible decisions during the "keep/eliminate" stage is an inherent and essential part of any merging process. This is no different when modding games.

3/ I might go against the majority here, but previews automatically generated for *every* model, like static screenshots or even animated GIFs, would actually provide very little information to me, plus they come with a good number of "show-stopping" inconveniences :

- a serious impact on the bandwidth (if stored online) and/or drive space (if stored locally) of the end user
- the inability to display the full extent of the model's features : WOK, PWK, animations, fixes, tintable textures, I'm sure there's more
- the render may or may not correspond to what one actually experiences in game (this already happens with NWNexplorer and the toolset)
- they need to be updated (which could arguably also be part of an automatic process, but process + process + ... ends up in a lot of processes!)

What I do find useful are screenshots of the packages as a whole, ideally accompanied by demo modules/erfs to quickly try them out, as they currently exist on the Vault. NWNexplorer, or MDLviewer (for animated creatures/phenotypes), are usually more than enough for me to get a pertinent perception of a model (through visuals and source). These tools have the merit of already existing, and any builder worth their name should be using them anyway. Furthermore, they could also serve in the cases where the identification of duplicates cannot be automated and requires user intervention. So maybe it would be sufficient to just plug into these tools locally, and leave in the online "database" only the manifests, comments and any extra data collected by the community ?
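For the part of duplicate identification that *can* be automated, a byte-for-byte hash pass over two packages is the obvious first filter. A minimal sketch; the function names are hypothetical, and it only catches exact duplicates — a "same" model recompiled, renamed or retextured still needs the manual inspection described above:

```python
import hashlib
import os
from collections import defaultdict

def hash_file(path):
    """SHA-256 of a file's contents, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(*roots):
    """Group byte-identical files found anywhere under the given roots
    (e.g. one root per unpacked package)."""
    seen = defaultdict(list)  # digest -> [paths]
    for root in roots:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                seen[hash_file(path)].append(path)
    return {digest: paths for digest, paths in seen.items() if len(paths) > 1}
```

Anything this pass flags is safe to deduplicate automatically; everything else falls into the "true duplicate or not?" judgment calls that the user has to make.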

So just these two addons :

- being able to categorize, filter and sort content, directly in the tool (without making it a little brother of Excel), to generate or manually feed manifests
- a split NWNexplorer window, visualizing existing content on one side, new content on the other, with the usual NWNexplorer trees (ideally the v11 one, which I can scroll through with direction keys, contrary to the v163 one)

would definitely rock, err, I mean significantly change my world for the better as an NWN builder.

I apologize for the repeated lengthy posts, but I'm quite passionate about the topic, and equally so about the project that Painofdungeoneternal (and Rolo?) initiated. I happen to be part of another community (the BG2 one), which has developed tools that seem quite similar in purpose : "BWS"/"BWP". I can appreciate (at least from the outside) what kind of challenge it is to create and maintain them. It's not my intent to appear inconsiderate of what is already "on the plate" for us in the future, just to offer (subjective, but hopefully motivated) feedback. It's also not my intent to further worsen my reputation as the local train "derailer", so I will stop here for today.

Whatever comes out of it, thank you again for doing this for the community!
               
               

               


                     Edited by Nissa_Red, 10 March 2013 - 02:25.
                     
                  


            

Legacy_Nissa_Red

  • Full Member
  • ***
  • Posts: 121
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #58 on: March 10, 2013, 02:41:53 am »


               Please don't put me to shame just yet, but to be less "wordy", and a bit more specific about my workflow :

[screenshot]
To illustrate 1/

[screenshot]
To further illustrate 1/

[screenshot]
To illustrate 3/ & 4/

[screenshot]
To further illustrate 3/ & 4/

[screenshot]
To further illustrate 3/ & 4/

[screenshot]
To illustrate 9/

PS : these forums are horribly user-unfriendly with images >.<
               
               

               


                     Edited by Nissa_Red, 10 March 2013 - 03:18.
                     
                  


            

Legacy_Lightfoot8

  • Hero Member
  • *****
  • Posts: 4797
  • Karma: +0/-0
Forum, Downloads, and Info: CEP & Project Q
« Reply #59 on: March 10, 2013, 05:36:02 am »


               

Nissa_Red wrote...

Please don't put me to shame just yet, but to be less "wordy", and a bit more specific about my workflow :

[147x148 screenshot: http://thumbnails102.imagebam.com/24237/15061c242365939.jpg]
To illustrate 1/

[320x320 screenshot: http://thumbnails102.imagebam.com/24237/1ddf2b242365024.jpg]
To further illustrate 1/

[320x100 screenshot: http://thumbnails101.imagebam.com/24237/a4a2de242365026.jpg]
To illustrate 3/ & 4/

[300x100 screenshot: http://thumbnails106.imagebam.com/24237/da80c3242367946.jpg]
To further illustrate 3/ & 4/

[300x150 screenshot: http://thumbnails103.imagebam.com/24237/042d20242367948.jpg]
To further illustrate 3/ & 4/

[300x300 screenshot: http://thumbnails101.imagebam.com/24237/259d06242365021.jpg]
To illustrate 9/

PS : these forums are horribly user-unfriendly with images >.<


eeeks    Still no good.
               
               

               


                     Edited by Lightfoot8, 10 March 2013 - 05:37.