KatsBits Community

[Unreal][U1] Exporting .T3D files from Blender to Unreal


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
Heya,

I posted on this forum a few times some years ago, perhaps somebody can help me with this issue I'm having:

I'm creating the organic geometry for an Unreal project that myself and some others are working on, using Blender, as it is faster, more flexible and more powerful than vertex editing within UEd 2.1. I export my geometry (the static mesh/model) from Blender as an .obj and import that into Unreal as a static mesh. I then have the opportunity to convert that static mesh into a brush (which is what I would like to do, so that I can have lightmap-based lighting on my world object rather than the blotchy vertex lighting Unreal currently has). However, when converting I receive several error notifications regarding the model:

---------------------------
Message
---------------------------
Critical Error: FPoly::Finalize: Normalization failed, verts=3, size=0.000000
---------------------------
OK   
---------------------------

Regardless, the model converts anyway, although it has quite a few BSP holes in it (which can result in the player being blocked by invisible collision). Myself and a few engine specialists, including the author of the latest Unreal 227 patch, agree that this most likely has something to do with the error I'm experiencing. Perhaps there's another way to export the model from Blender to .t3d?

Another person, who uses 3ds Max, told me that he uses the ASE tools to convert models directly to .t3d, and he expected Blender would most likely have this plugin as well. I managed to locate Goofos' ASE plugin in the downloads section, but for some reason Blender doesn't load it in the list of plugins; I assume it must be for an incompatible version of Blender, which would be why it isn't being installed.

My question is: how else could I export the model as .t3d from Blender in order to import it into UEd? Alternatively, does anybody have any knowledge of the error mentioned above, and what I would need to do to fix it?

Thanks for the help in advance.





Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3148
    • KatsBits
It appears you're getting floating point precision errors relating to 'off-grid' vertices. Basically, BSP volumes aren't sophisticated enough to properly handle structures that have fractional coordinate values, i.e. 0.2872 instead of 1.0 ("1" is the lowest grid increment that should be used, as this represents the lowest whole number BSP is comfortable with). When these are encountered it usually results in broken structures or collision errors - missing faces, collapsed vertices, broken faces, invisible surfaces, infinite volumes.
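To illustrate the error message itself: a zero-area (degenerate) face produces a normal of zero length, which can't be normalized - hence "Normalization failed, verts=3, size=0.000000". A minimal Python sketch of the maths (purely illustrative, not Unreal's actual code):

```python
# Sketch of why a degenerate triangle breaks normalization: the cross product
# of its edges has (near-)zero length, so dividing by it is impossible.
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def face_normal(v0, v1, v2):
    """Return the unit normal of a triangle, or None if it is degenerate."""
    e1 = tuple(b - a for a, b in zip(v0, v1))
    e2 = tuple(b - a for a, b in zip(v0, v2))
    n = cross(e1, e2)
    size = math.sqrt(n[0]**2 + n[1]**2 + n[2]**2)
    if size < 1e-9:   # zero-area face: normalizing would divide by ~0
        return None
    return tuple(c / size for c in n)

# A healthy triangle has a well-defined normal...
print(face_normal((0, 0, 0), (16, 0, 0), (0, 16, 0)))   # (0.0, 0.0, 1.0)
# ...but three collinear vertices do not, so conversion fails on that face.
print(face_normal((0, 0, 0), (8, 0, 0), (16, 0, 0)))    # None
```

Off-grid vertices make this more likely because rounding during conversion can collapse nearly-coincident or nearly-collinear points into exactly degenerate faces.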

As there doesn't seem to be a viable *.t3d exporter for Blender you could continue on your current path, but make sure, in Blender, to snap your model's vertices to an Unreal-compatible grid spacing (it's similar to GtkRadiant/Radiant iirc, so try a test using these settings, or these for 2.49 or below) - this ensures the model you export is at least structured so each vertex is valid relative to what UEd1 can process and convert to BSP volumes.
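As a rough illustration of what grid snapping does to each vertex (a generic sketch of the arithmetic, not Blender's API - the function name and spacing values are just examples):

```python
# Minimal sketch of grid snapping: move each coordinate to the nearest
# multiple of the chosen grid spacing (e.g. 1, 8 or 16 Unreal units).
def snap_to_grid(vertex, spacing=1.0):
    return tuple(round(c / spacing) * spacing for c in vertex)

# An off-grid vertex from an organic model, snapped to a 1-unit grid...
print(snap_to_grid((12.2872, -3.91, 7.5), spacing=1.0))   # (12.0, -4.0, 8.0)
# ...or to a coarser 16-unit grid.
print(snap_to_grid((12.2872, -3.91, 7.5), spacing=16.0))  # (16.0, 0.0, 0.0)
```

Doing this on export-side data is the same operation Blender performs when you snap selected vertices to the grid; the point is that every coordinate ends up a whole multiple of an increment UEd1 can represent.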

To help with this you may also need to rethink your mesh/model slightly and stick to structures that aren't quite so 'organic'. For instance, where you have a larger polygon, remove the vertex that sits in the middle - you want '[/]', a quad split along the diagonal into two tris, instead of '[X]', the same quad broken into four tris because of a central vertex.

Also, try sticking to completely solid structures in Blender. In other words, model using 'volumes' rather than a traditional mesh 'skin'. Again, this requires a bit of rethinking relative to the organic shape you're trying to produce.

Alternatively, export to ASE and then convert the resulting model to T3D using the ASE2T3D converter (which is probably what your colleague was referring to).

Nice model and object by the way.


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
Heya,

Hmm, that's odd. I was sure I set Blender, while working on this, to abide by an Unreal-style grid of about 16 units, and afterwards made sure that all vertices were snapped to the grid.

Although when I do import it (or convert it to a brush, I should say), all of the vertices are offset from the grid? Not sure what's going on there, but that's probably a problem. Despite being offset, though, once I snap them to the grid within UEd 2.1 all the vertices line up properly; perhaps the relation between the origin and the vertices is off, but if so I'm not sure how that happened.

I'll check out the ASE export and the ASE-to-T3D converter, thanks for the link.

I'll report back here if I have another problem with this process or the result; thanks a lot for the help and for the fast response.


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3148
    • KatsBits
Vertices shifting about might be down to precision issues again; if vertices are interpreted as off-grid when being parsed for conversion, they may be forcibly repositioned to 'valid' coordinates by the process (relative to their original location and what the converter considers 'valid') - effectively the converter is 'snapping' them to valid positions relative to the editor grid. If that happens it may again result in broken brush volumes, gaps etc. Regarding the "16" value, is that the "Scale" or "Lines" setting? If it's the latter, that won't have any bearing on the precision of the model, so make sure the value relates to the "Scale" setting (with "Subdivisions" set at "8") - Blender's default grid settings aren't wholly valid/usable for BSP work.

Basically, doing what you're doing is pretty tricky to get right because of fundamental differences between the way models are used versus brush volumes - you can be more abstract with models, but not so much with volumes, so you have to adjust the way you work to suit. If you haven't seen this tutorial by sock, give it a read in the context of some of the things discussed above; it'll help a great deal.


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
You were right regarding the grid: the subdivisions were set to 16 and the scale was set to 8. I have adjusted it now and have snapped all vertices on the model to the grid. A lot of BSP holes have been eliminated; there are only about 2-3 left. I also tried that ASE to .t3d converter and I don't get the same errors. :)

Also, to note, all of my faces have been '[/]'-ified (i.e. triangulated).

What is the difference between volume models and mesh skins, I must ask?


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3148
    • KatsBits
Basically, BSP (Binary Space Partitioning) is an optimised game world based on volume, i.e. three-dimensional areas defined by a set of boundary coordinates. For Unreal, BSP is 'negative' in that volumes are carved from the world, whereas idTech is 'positive': volumes are placed into a void. Both create blocks and areas. Models, on the other hand, are 'skins'; they're essentially a collection of linked 2D planes manipulated into a shape resembling something.

A cube, for example, as a model has 8 corner vertices that describe 6 planes / 2D faces (three of which will share one vertex at each corner) that just happen to form a closed mesh. As a volume, those same verts and faces define a volume, an area bound by those elements. The difference largely boils down to this: a mesh can have openings and gaps, a volume can't; volumes always have to be solid (if they are not, they're no longer volumes).

And this is basically where the problem lies in converting models to brush volumes: each 2D face has to be given 'depth' or 'volume' to be meaningful in BSP world space. The extrusions that occur to facilitate that are where errors tend to crop up (notwithstanding grid and other issues already mentioned). This means if you're making something specifically for converting to brushes, you ideally need to structure the model so it's made from blocks and volumes.
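One way to picture the 'sealed volume' requirement: in a closed mesh every edge is shared by exactly two faces, whereas an open 'skin' has boundary edges used only once. A small Python sketch (the face lists are hypothetical example data, not tied to any Blender or Unreal API):

```python
# Watertightness check: count how many faces use each undirected edge.
# A sealed mesh (a candidate 'volume') uses every edge exactly twice.
from collections import Counter

def is_closed(faces):
    edges = Counter()
    for face in faces:
        for i in range(len(face)):
            a, b = face[i], face[(i + 1) % len(face)]
            edges[tuple(sorted((a, b)))] += 1
    return all(count == 2 for count in edges.values())

# A cube: 8 vertices (0-7), 6 quad faces -> sealed, convertible to a volume.
cube = [(0,1,2,3), (7,6,5,4), (0,4,5,1), (1,5,6,2), (2,6,7,3), (3,7,4,0)]
print(is_closed(cube))          # True
# The same cube with one face deleted is an open 'skin', not a volume.
print(is_closed(cube[:-1]))     # False
```

A mesh failing this kind of check is exactly the sort of thing that can't be turned into a brush volume without the converter guessing (and often getting it wrong).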


Very simple rock outcrop in Blender. At face value it can be made in a number of ways. However, depending on the final usage it will need to be either a 'skin'- or 'volume'-based object.


If the object (simple rock outcrop) is to be a model it can be constructed as normal, a series of 2D planes shaped and manipulated to form a structure. The final object does not need to be 'closed' or 'sealed'.


If the outcrop is to be converted into brush or BSP volumes it should be constructed in Blender from a series of solid blocks, shaped using grid snapping to a valid Unreal grid. Each object must be sealed/closed.


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31

Ah, when you said volume models and mesh skins I thought you were referring to something else. In Unreal, 'brushes' have the capacity to be much more complicated and fluid than in the idTech engines. The brush/model I have going here is sealed; by BSP holes I was referring to a phenomenon that can occur within the Unreal engine, where sometimes the nodes on surfaces don't get compiled properly and end up leaving a hole through geometry, a HOM, or a kind of 'non-space' (where, when touched, things instantly get killed, or get deleted if they have the property to collide with the world).

Unreal has the capacity for both subtractive space/brushes and additive space/brushes (the negative and positive BSP you mentioned), rather than strictly one :P

The example shown as brushes in the last picture is actually not healthy for the way Unreal's BSP system works, in the sense that if you were to make something complicated out of bare primitives it would put more strain on the compiling process and cause more geometric instability. So after construction it's often customary to merge 'large objects' into single brushes so that the game engine potentially handles them better (if the author puts them together relatively well).

So, BSP holes are sort of the downside to being able to have much more complicated and larger maps, sadly.

That being said, there's probably some other issue causing the BSP holes. One, as you said, can be vertices not sitting on proper grid co-ordinates, but other things can trigger this too. Apart from the grid problem (which I have adjusted in Blender), there has to be something else causing the geometric instability; I just wish I knew what.


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3148
    • KatsBits
You'll need to do a bit of experimenting using some simpler structures... it may just be the relative complexity of the mesh being converted that's causing the problem. If that's the case it may be a better option to split the model into sections in Blender and then essentially stitch it back together in UnrealEd once the parts are loaded.


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
Thanks for the help, I'll try to see what I can do with splitting the model up. Can't touch the middle section too much as I have a cave in there :P

Either way, when the next 227 update is released it might clean up the BSP compiling system to better optimise the geometry; hopefully it will be improved enough to fix some of these issues.


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
Hey again Kat,

After some experimenting with exporting world geometry to UnrealEd from Blender, I noticed that the suggested grid spacing of Scale: 8 and Subdivisions: 8 doesn't actually match up with the lowest grid spacing of UEd. This could be why I've been having various problems with the BSP. After some fiddling (and meticulously counting grid spaces in the UEd and Blender orthogonal views), I have discovered that the setting for Blender should actually be Scale: 16 and Subdivisions: 4. I thought I'd mention this here so that anybody else looking for how to properly set up an Unreal-style grid in Blender could find it.
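For reference, assuming the finest snap increment works out to Scale divided by Subdivisions (my reading of how the two settings combine, not a documented formula), the two configurations give quite different increments:

```python
# Rough sketch comparing grid configurations, under the ASSUMPTION that the
# finest snapping increment is Scale / Subdivisions.
def finest_increment(scale, subdivisions):
    return scale / subdivisions

print(finest_increment(8, 8))    # 1.0 unit  (the earlier suggestion)
print(finest_increment(16, 4))   # 4.0 units (the setting I found to match UEd)
```

If that assumption holds, the 16:4 setting simply gives a coarser (4-unit) increment, which may be what lines up visually with UEd's grid.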

If you have any Unreal Engine 1-relevant tutorials about importing world geometry from Blender, it might be worth updating the Scale: 8 and Subdivisions: 8 configuration to the one mentioned above. :P


Offline kat

  • Administrator
  • Hero Member
  • *
    • Posts: 3148
    • KatsBits
Hmm... 8:8 shouldn't give you issues in terms of grid accuracy (snapping to *an* increment of the Editor's grid) because they are divisible 'power of two' values (2 x 8 = 16 and 8 / 2 = 4) on the same sliding scale as the ones you ended up using. If you're trying to match something specific though then yes, increase/decrease the values to suit (8:8 are generic 'BSP' values). The best way to know for sure is to export a standard block from UnrealEd that's snapped to, and fits, the grid needed, which can then be imported into Blender as a reference. Glad you got there in the end though ;o)


Offline Asaeis Wi Vio

  • Newbie
    • Posts: 31
Hey again,

Eventually I decided to go with static meshes for all of the organic geometry featured in the project, as meshes are capable of vertex lighting, and static meshes light their vertices depending on whether they can be raytraced to a light source (which is useful). Myself and others have made a lot more progress on this project since I last posted; I might post some shots in the WIP section or something.