Well, it appears that special care was given to give credit where credit is due, which is what all the links are for in the description. However, I think just compiling the "final" haks would have been better still.
I am surprised that CEP allowed any of their content to be removed/split from their compilations as they have always been very vocal about their "ownership" of other folks works in the past.
As to Maximus: in my experience with him over the past nine years or so, he has ALWAYS been a very helpful person, in more ways than the community at large has ever officially recognized. He may even be willing to give a port into the db, if asked correctly. I.e., just the table name(s) and access to export those tables, OR he may be willing to port them directly (less likely, as the bandwidth required is going to be HUGE, and it would require him to export the data to himself and then either send some sort of link to that data OR send the data directly).
I know I have half a dozen CDs (yes, CDs, not DVDs) of data that I grabbed from the vault years ago. I would not be able to transmit that much data across my internet connection without it hogging the bandwidth for at least a week.
Has anyone considered a time limit or data limit on how far back you are willing to "mine" the data? I mean, there are haks up there from the very beginning of NWN, and I would suspect that most of those haks have had no traffic for years or have been superseded by much more recent uploads. Of course, having said that, I also know that there are some gems buried back there that are still worth saving, though they may still require updating/fixing.
As you all know, I have always been tileset specific in my searches for data, and have saved mainly tilesets. I have many of the haks, with whatever documentation was included in them, but have no backups of the original postings on the vault. Gawd, I wish I could have kept the CTP plugging away; we had a huge amount of content that never got finished and released. A large section of the "extra" work we had done has been lost, but I still have the "original" files stored away on CDs with much of the "original work" that was performed by the early CTP team. (I lost the interim work that was performed by the "middle" team during CTP's life cycle, but still have most of the "end" stages, etc.)
Anyway, back to THIS project. We are talking in the range of 150-200 GB, possibly more, of data to mine. I don't care how fast your internet connection is; that is a HUGE amount of bandwidth, and it WILL set off alarm bells for any ISP out there. Downloads to a personal PC/location are one thing, but when you start "sending" that much data to a centralized location, your ISP may, and likely will, throttle or cut your connection on a monthly basis.
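Just to put rough numbers on that claim (these are back-of-the-envelope figures, assuming an uncongested link and the 150-200 GB estimate above):

```python
def transfer_days(size_gb, link_mbps):
    """Days needed to move size_gb gigabytes over a link_mbps megabit/s link,
    assuming the link runs flat out with no overhead or throttling."""
    bits = size_gb * 1e9 * 8           # gigabytes -> bits
    seconds = bits / (link_mbps * 1e6)  # bits / (bits per second)
    return seconds / 86400              # seconds -> days

# 200 GB over a typical 10 Mbps upstream: roughly 1.9 days of saturation
print(round(transfer_days(200, 10), 1))   # ~1.9
# Over a 1 Mbps upstream (common for older DSL): nearly three weeks
print(round(transfer_days(200, 1), 1))    # ~18.5
```

So even in the best case you are saturating someone's upstream for days on end, which is exactly the kind of sustained traffic that trips ISP caps.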
Have we figured out how/where, exactly, the data is going to end up? I know, I know, you have this Drupal site, but from my recent experience in posting there, Drupal is going to take a huge amount of re-editing of posts to get the formatting to work. Much less transmitting all the actual hak files.
Please excuse me if I missed some notes on this across all the posts for this project, but I want this project to succeed and am just making sure we are ALL considering the amount of real data we are attempting to save, along with the "dangers" involved.
quick recap:
1) Age of files to save?
2) Size of files to be saved (along with posts etc.)
3) Formatting: which, to my knowledge, has not really been addressed yet.
4) Possible direct DB access for direct export and to where exactly?
5) Editing/reformatting of all that data?