KatsBits Community

Preserve "brushes" in Blender


Offline James Stortz

  • Newbie
    • Posts: 4
I've imported a functional Q3 map into Blender and then exported it right back out. The issue is that solid 3D "brushes" have been replaced with many 2D triangles. Is there a way to preserve them upon importing/exporting?

I have a pair of importer/exporter scripts for Blender 2.49.
 - Import: .ase (I first convert the map to ASE)
 - Export: .map

My thoughts:
 Maybe the .ase conversion is what hollows brushes into triangles?
 Maybe Blender has a function that solidifies them back?
 Maybe there's a better .map exporter that solidifies triangles back into brushes?

Reference:
 Top to bottom: the default map, then the results of importing, exporting, and compiling.



Thank you :)


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3145
    • KatsBits
A couple of points to note on this:

1) if you import a decompiled BSP level you are basically going to be working with planes and polygon 'sheets', as everything is broken down into raw triangles (tris) that can't easily be reconstituted as brush volumes in Blender, at least not without a fair bit of manual reconstruction.

2) although you can import the model into Blender, the mesh sections then available are not all going to be 'convex' volumes. Looking at the very top image you posted, you can see the selected brush volumes are simple shapes that, collected together, form the shape of the platform. In Blender the same area looks to have been imported as an n-gon (second image). Those can't be used in their raw state and need to be broken down into simple shapes like those shown in the first image to work effectively within Blender.

So your options are (assuming you need to export brush volumes from Blender):

1) each surface needs to be broken down into 'valid' structures - primitive shapes, basically. Essentially you have to use, or think about, Blender the same way you would GtkRadiant or any other brush-based editor; just as you can't make complex compound (concave) shapes in Radiant, you can't do that in Blender and export a usable *.map file. So each compound shape needs to be composed of several primitive shapes aligned to form more complex objects [this is IMPORTANT!].
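As an aside, the convexity rule is easy to test programmatically. Here's a rough vanilla-Python sketch (my own throwaway names, nothing to do with the actual import/export scripts): a closed mesh is convex, and therefore a candidate brush volume, only if every vertex lies on or behind the plane of every outward-wound face.

```python
# Rough convexity test for a closed mesh with outward-wound triangles.
# verts: list of (x, y, z) tuples; faces: triples of vertex indices.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def is_convex(verts, faces, eps=1e-6):
    for i0, i1, i2 in faces:
        # face normal from the face's three vertices
        n = cross(sub(verts[i1], verts[i0]), sub(verts[i2], verts[i0]))
        # every vertex must sit on, or behind, this face's plane
        if any(dot(n, sub(v, verts[i0])) > eps for v in verts):
            return False
    return True
```

A tetrahedron passes; a vertex poking outside any face plane fails, which is exactly the kind of shape that breaks a *.map export.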

2) for successful export to *.map, flat 2D planes need to be 3D volumes. Ideally you want to manually extrude volumes from the imported structures (basically rebuild the level to suit BSP/volume requirements). Alternatively it is possible to force an extrude on export, but that can be a tad unreliable because it has to extrude ALL surfaces by the same value (at least "8"), so you may overshoot and create inverted, invalid volumes (which crash/stop the export process).
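The "force extrude" step can be pictured with a tiny vanilla-Python sketch (purely illustrative; the function name and the clamp are assumptions based on the behaviour described above, not the exporter's actual code):

```python
MIN_DEPTH = 8  # minimum safe extrusion depth in map units, per the above

def extrude_face(verts, normal, depth=MIN_DEPTH):
    """verts: (x, y, z) corners of one flat face; normal: unit direction
    to push the duplicated face along. Returns the front corners followed
    by the offset back corners (the side faces aren't built here)."""
    depth = max(depth, MIN_DEPTH)  # clamp so the brush isn't paper thin
    back = [(x - normal[0] * depth,
             y - normal[1] * depth,
             z - normal[2] * depth) for (x, y, z) in verts]
    return verts + back
```

Because every face gets the same offset, two nearby faces extruded toward each other can still overlap or invert - that's the unreliability mentioned above.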

With that said... as you're able to open the level in Radiant, rather than compiling and converting to ASE, try exporting an OBJ (*.obj) model from the editor, as that ignores all texture information and tends to keep brush volumes intact. You'll need Radiant 1.5 for that, as that's the only version with OBJ export iirc. It may also be possible to convert to OBJ using Q3Map2, but doing that converts the optimised structures only - the level with all the junk removed (volumes parsed as surfaces and planes).

Have a read through "Brush-based Structures in Blender" to get a better understanding of what's going on when using Blender to export MAP files. Also have a read through "Prepping meshes for Map Export".

Alternatively, you could break the mesh into sections in Blender and export each as a textured ASE model, which then requires copious amounts of playerClip and caulk hulling to make the level playable and to compensate for models not blocking VIS etc.


Offline James Stortz

  • Newbie
    • Posts: 4
Thank you so much, Kat! This helps me out a ton!

So I may go with *.obj import, since *.ase import is inherently triangle-based.

I appreciate you breaking it down like that; it makes a lot of sense. I'm new to Blender and didn't realize it could work with solid brushes. If I'm understanding correctly, the way I can import them successfully, as *.obj, means I will also lose the texture information and have to re-apply it in Blender. (I assume it keeps the texture information from that point on, maybe with some path discrepancies.) If I import as *.ase then I have to undo a lot manually: rebuilding brushes, patching, clip masking, etc.

Now, I know in Radiant I can select a brush or surface and inspect its texture information. Would it make sense to re-apply the textures in Blender accordingly? (Would the rotations/locations/scales hold?) We also know the *.map file must store that texture information, so it might be worthwhile for me to write a helper script to automate that. Which brings me to the next point: what about shaders? I assume Blender wouldn't recognize them, and finalizing in Radiant would allow for touching up textures and shaders anyway.

So,
Code: [Select]
Import   Converted   Triangles   Brushes   Textures   Shaders   Entities
*.ase    *.bsp       Triangles   -------   Textures   -------   --------
*.obj    *.map       ---------   Brushes   --------   -------   --------


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3145
    • KatsBits
Blender doesn't use solid brushes per se; what it's doing is treating primitive shapes, mesh objects like the default Scene cube etc., as 'solid' in the sense that they are closed meshes (they don't have openings or holes). This allows the *.map script to correctly convert them to valid volumes on export.
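That "closed mesh" property is simple to verify by hand: in a watertight mesh every edge is shared by exactly two faces. A rough standalone Python sketch (names are mine, this isn't the exporter's code):

```python
from collections import Counter

def is_closed(faces):
    """faces: list of tuples of vertex indices (tris or quads).
    Returns True if every edge is used by exactly two faces."""
    edges = Counter()
    for face in faces:
        for i in range(len(face)):
            a, b = face[i], face[(i + 1) % len(face)]
            edges[(min(a, b), max(a, b))] += 1  # undirected edge key
    return all(count == 2 for count in edges.values())
```

A complete cube or tetrahedron passes; delete one face and it fails, at which point it is no longer a valid volume for export.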

Re: texture data etc. Yes, using OBJ will strip texture and UV data from the file; you'd be dealing with raw brush volumes, so all the shader info would need to be reassigned in Radiant, or in Blender after properly UV mapping everything. It would be quicker for you to export a valid *.map file and then open that in Radiant for texturing, as that also ensures the UV/texture coordinates are valid (this doesn't matter if the level is exported as an ASE model, and any discrepancies in a .map may be auto-adjusted by Radiant).

With respect to shaders, if you're texturing in Blender you just need to make sure you're UV mapping the corresponding bitmap image to the surfaces you want to appear textured in-game; they should then be picked up in Radiant. If not, it's relatively easy to do a find-and-replace swap in the editor (Radiant).

- ASE:
keeps UV/texture/material data (rotation/placement etc.),
strips (usable) volume information,
can be used as-is but requires clip brushes, autoclip key/value entity pairs, or shader-based clipping.

- OBJ:
strips UV/texture/material data,
keeps brushes as 'solid' meshes,
must be exported to .map or .ase (subject to the above); everything will need to be re-textured and UV mapped.

Regarding the little helper script: it might not be necessary, but it depends on what you're trying to do and your level of skill with Radiant or Blender. If you have access to a decompiled BSP and don't need to do anything special (no engine-specific considerations to keep in mind), stick with Radiant; it's marginally easier to manage files and still keep everything 'valid'.

There's a lot to unpack in the info above; if any of it is confusing, include some information on what you're trying to do, i.e. what the end goal is. Should be able to answer more specifically then ;o)


Offline James Stortz

  • Newbie
    • Posts: 4
I believe I am following along here just excellently, and I admire your expertise. I'm starting to get really motivated about this project! I was going to keep the concept under wraps, but since you've been so kind, I'll share a little bit. :)

| Methodology

Basically, I'm going to design a CGI-quality level, render it with ray-tracing (i.e. V-Ray), and generate assets with photogrammetry (i.e. 123D Catch or PhotoScan). This should, in theory, yield corresponding models and assets with unprecedented photo-realism, which may need some additional work, but will ultimately play in-game at amazing performance!

I don't know why nobody has thought of this idea yet, but it was only a matter of time. I came up with it months before I found out about The Vanishing of Ethan Carter on the Unreal Engine and Star Wars Battlefront, which deserve ample credit for their beauty and creativity. Those games, however, are not quite the same as this idea. Photogrammetry is a logical step up from the methodology of applying actual photographs to 3D models. Its main use-case is, famously and impressively, importing actual real-world objects and environments, which is what we've seen so far. That is also its limitation and/or drawback, since the lighting is set in stone, no pun intended. However, I believe the combination of the two technologies, ray-tracing and photogrammetry, sleeps at our doorstep like a workflow in a manger. Since photogrammetry requires a series of photographs taken at angles around the subject in order to stitch them together (think Google Maps), it works perfectly with virtual environments that can render those views with precision and control.

Food for thought:
Also, I believe, in this manner particularly, it is possible to generate alongside each texture a light-map, for lack of a better term, that supports any given angle of light. (For instance, a texture with red shadows for overhead light, blue for left-incoming light, green for right-incoming light, etc.) That might make its way into the project later on, after engine modifications eventually come into play.


Speaking of perfectly, here's how this map implementation is going to work. I'm actually going to use the original map (except with all invisible textures) as the underlying clip mask. The intention behind the HQ level is to show immaculate terrain, along with a splendid skybox and some atmospheric effects. However, I didn't want to affect the original structures of the map, since those have everything to do with the desired gameplay. So, I'm going to apply the photo-realistic level as "detail" over the top of the original map. See, perfect!

| Thematics

Venturing out into space, men in macro-robotic ships continued the expansion of their civilization yet again. This time they established an artificial ring around the planet: a space-station, a space-nation, sizable, luxurious, and hospitable to life. Breathing oxygen supplied by onboard gardens, men spent their days gazing through giant encompassing walls of glass into the stars, or down at the Earth below. This had been life as they knew it for generations.

When constructing the ring, rather than hauling up materials through the atmosphere, meteors were gathered and mined for their plethora of minerals, the great slabs chiseled and mounted into structures right in their justified domain. In doing so, magnificent arrays of rock, dust and gas formed, decorating the artificial ring just as the natural rings do around our other planets.

This map is either a training arena or a sport arena caught in orbit outside of the ring. That would explain the hovering platforms, decked above the happy spectators inside, amidst the glorious trails of cosmic debris. The sport would be played one-versus-one in the robotic suits, equipped with hand-held laser guns and thrusters. The laser guns not only disengaged an opponent's suit, resetting their position in order to gain a point on the scoreboard, but could also blast off of surfaces, especially when hyper-activated by specially-designed launch-pads. This explains the guns and launch-pads, while the thrusters explain the gravity and the ability to change direction whilst already in motion. You see, the thrusters ran a program to simulate gravity, in order to better complement man's natural athleticism, and could be controlled for additional axial movement in space.

The map is somewhat of a luxurious destination, akin to a beach house or an island getaway, but rather than oceans and waves, we have space and stars, and rather than rocks and sand, we have meteorite rock and dust. Clouds and seaweed, gas and ice. Instead of palm trees, well, I suppose we have teleporters. :) And then, of course, we have... the sun and the moon.




| Story

Now I can share the prequel for this map, for fun! Just like the idea to apply photogrammetry and photorealism together, I'd been kicking around some fantasies about a cinematic story with characters, setting, and all that. It wasn't until I was inspired by QuakeJS that I had the thought to put all of this together!

Since you're a level guru, I might as well outline some of the fun details right here, which I have to do anyway.

So, the story goes that in the future, society has decided to re-zone. (Up until this point, in the current day, civilization was developed on a first-come first-served basis as lands were explored, and then on an as-needed basis once lands were established.) There was never an initial blueprint, which means it's not possible that the roads and cities just happened to be laid out in the most optimal manner.

Our modern buildings would be left as the "legacy layer," while other layers were built upwards. A work layer, a residence layer, a vacation layer... transportation between hospitals, schools, manufacturing and everything was now as efficient as possible. A collective campus, worthy of an advanced society.

To take on such an undertaking, they put to use the technology of manned robots. These robots were in the shape of men, but on much grander scales, so as to have the most intuitive interface possible. Instead of lugging around materials and constructing with big Cat machinery, pulling levers and wheels, men operated stronger mechanical versions of themselves, quite naturally at that. When they wanted to place a beam, they just picked it up and placed it.

At first, the technology was intended for construction, but it soon found use in many other areas, such as the military. Robots could be of any size, controlled via exoskeleton suit or virtually, from within or remotely, and eventually revolutionized the concept of the vehicle. Before long, everybody had one of some sort. It became integral. Everything was more intuitive as a human, a superhuman that is. Then came the ability to launch into space, and so began the new space age. =)))

(Dotted all throughout here is plenty of room for more backstory development.)


| Conclusion

I'm actually making a single-player RPG out of this, as the map will be one of many destinations of its kind. The game engine is sufficient for prototyping the ideas I have in mind, like bots as instructor characters that follow a preset path with guide trails, training your skills with various exercises and tutorials. I don't know if you caught this, but the laser-gun game I was mentioning is actually classic InstaUnlagged! :) That's why the surface clipping has to stay as close to the original as possible. And that's the joke about why, in Quake physics, we can move around freely in the air. Hopefully the final result will be, first and foremost, fun gameplay, with some updated graphics, that can run in the web browser or at least on just about every platform, including mobile.

I think this approach is a modestly good way to revamp a game engine like this, without overhauling it, because at that point, I'd rather start from scratch and go all out with ideas. :)

Anyway, even if you didn't read all this, I hope you enjoyed it about as much as I did! And if you want to say anything pertaining to this approach or anything, just fire away! Let me know what you think!

Thanks!


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3145
    • KatsBits
Speaking to the technical aspects of the idea: you're wanting to place a modelled skin over the top of a brush based structure and replace all the default textures/materials with new photo-realistic content? If that's the case it might be better to just model the level(s) in Blender.

Here's what you could do: model the level in Blender as normal for Quake 3. That will be the 'clip' hull (this is essentially caulk hulling). Duplicate it and increase the detail where necessary to fulfill the 'photogrammetry' requirements; this forms the 'skin'. Export the former as a *.map file. Export the latter as an *.ase model - this part of the process will likely require the 'skin' to be split into multiple sections and elements to make it manageable, and so each falls within the strict limitations of BSP compiling (assuming use of Q3Map2).

Depending on how you're producing texture and material assets from the results of photogrammetry, you may need to UV unwrap and texture the 'skin' models as *models*, rather than the way that might be done if they were simple primitive shapes like brush volumes (essentially producing a unified UV map rather than one broken into planes and regions). What you're thinking of doing is certainly possible; it'll just take a bit of figuring out to get the pipeline nailed down to what you're intending to do.


Offline James Stortz

  • Newbie
    • Posts: 4
Thank you for confirming my hopes here! I'm glad it seems to check out.

(I am so sorry if that whole backstory/theme was too... much! I didn't realize in my head it'd turn out so cumbersome. I need to work on that.)

I've heard of a method of importing higher-poly models with separate skins, so I am totally on the same page with the approach. Those are great ideas and, yeah, it will be a real experiment, like you say, to get it streamlined: model in Blender, render with V-Ray, photogram it, re-skin, finalize, etc. I will definitely continue to refer to your guides and all your help.

Seriously, haha Thanks. This is awesome!


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3145
    • KatsBits
Nothing wrong with the back-story, it's just not something one can provide a solution for (not broken, so nothing to fix). Heh!

Also, I forgot to add: I'd suggest working on a much smaller level to start with so you know what you're up against in terms of content production, as ASE has some limitations that will need to be worked around if you settle on it as the final format of your in-game 'skin' models (iirc models can't be more than 1024 vertices, otherwise compiling produces a MAX_.. error).
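If it helps, pre-splitting the 'skin' to stay under that kind of cap can be done greedily. A rough sketch in vanilla Python (the 1024 figure is from memory as noted, and the function is an illustration, not part of any exporter):

```python
VERTEX_CAP = 1024  # per-model vertex limit, as recalled above

def split_by_vertex_cap(faces, cap=VERTEX_CAP):
    """Greedily group faces so each group references at most `cap`
    distinct vertex indices. faces: list of vertex-index tuples."""
    groups, current, used = [], [], set()
    for face in faces:
        grown = used | set(face)
        if current and len(grown) > cap:
            groups.append(current)  # close the full group
            current, grown = [], set(face)
        current.append(face)
        used = grown
    if current:
        groups.append(current)
    return groups
```

Each returned group would then be exported as its own ASE 'skin' section.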