Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 7th Nov 2012 20:26 Edited at: 7th Nov 2012 20:27
I had this idea during the summer that sounded spot-on back then; however, now, three months later, when I've finally sat down to implement it, I'm running into some issues.

Assume I have a large texture that in itself contains several smaller textures (these are all on identical squares for simplicity; assume one 2048x2048 texture map that contains 4 by 4 textures of 512x512 pixels each).


Now I want to "scale" one of these smaller textures so that it repeats over a surface. Let's say two times.
I figured I would just make the u and v steps twice as large and then repeat from the initial value once they get larger than the bounds of the desired sub-texture. However, as I thought this through today, won't that result in the entire texture being rendered (in reverse) between the vertex holding the "last" UV coordinate and the following one, where it starts over again?
If so, are there any suggestions as to how to go about this?

Thanks in advance,
Rudolpho


"Why do programmers get Halloween and Christmas mixed up?"

Attachments
Chris Tate
DBPro Master
Joined: 29th Aug 2008
Location: London, England
Posted: 7th Nov 2012 23:44
I'm not 100% sure what you are trying to do, but I don't think you can have more than one "UV scale" on a given vertex; therefore it is probably impossible to repeat a section of an image on a polygon without shaders. I could be wrong, who knows.

With a shader, you are in control of which parts of an image get drawn, and how often, on a per-vertex and even per-pixel basis. With hard code, you are not.

Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 7th Nov 2012 23:55
Yeah, you can't scale a texture on a texture atlas without using a shader. You'd either need to duplicate your geometry, have each section use its own texture, or use a shader.

Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 8th Nov 2012 00:00
Basically I'm trying to be clever about allowing multi-texturing of a terrain-type mesh beyond the eight texture stages allowed.
The plan was to use this technique on four texture layers and then be able to switch the actual textures for smaller sections of the full mesh.
With that said, yes, the UVs can indeed be different for each of these texture stages.

I am thinking about doing it using a shader, but there are very few genuinely useful tutorials on those around; most things you do find are either for GLSL or are hidden away in obscure subsections of XNA tutorials.
My main problem with doing this using shaders at the moment is that I have no idea how to access the vertex data of adjacent vertices, as I suppose I would have to:
i.e. if this vertex has a U coordinate of 0.5 it should wrap back and use the texture at U = 0.25 instead, but only if the next vertex (along the U-axis) has a U-coordinate less than the current one. (Otherwise it might be that the current one is the first and we should really step up to 0.75 before changing, and so on.)
Is there any way to do this?


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 8th Nov 2012 00:13
Even with a shader there will probably be some annoying complications. For example, there will be problems with filtering, mipmapping, wrapping and clamping - unless your application needs some of these turned off anyway.

When UV values near 0 or 1 are required, the application needs to know whether the texture read has to be clamped or wrapped around. Clamping won't work unless your sub-image happens to be on the edge of your main texture, and even then only one or two edges will be clamped. Wrapping, which sounds like the one you want, won't work at all because it will be the main texture that gets wrapped rather than the sub-image.

Similarly the usual filtering that takes place will give problems along the boundaries between sub-images because instead of using clamping or wrapping of the sub-image the texture read will be using pixels from neighbouring sub-images.

Somewhat more complicated difficulties arise with mipmaps because mipmapping gets triggered along the edges when it's not needed.

Some of these difficulties can be mitigated by making each sub-image slightly larger than the image that's actually required. For example, your true sub-image might be 500x500 pixels but a wrapped (or clamped if that's what you need) border is added all round to bring it up to 512x512. That might reduce the worst of the effects.
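
In HLSL the lookup for a padded sub-image might look roughly like this (a sketch using the numbers above - a 500x500 usable area inside a 512x512 cell of a 2048x2048 atlas, i.e. a 6-pixel border all round; cellOrigin and localUV are assumed names, not code from this thread):

float cell   = 512.0 / 2048.0;     // size of one atlas cell in UV space
float border =   6.0 / 2048.0;     // border width in UV space
// localUV is the 0..1 position within the sub-image, cellOrigin the cell's top-left corner in the atlas
float2 atlasUV = cellOrigin + border + frac(localUV) * (cell - 2.0 * border);
float4 col = tex2D(atlasSampler, atlasUV);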

But as Chris Tate says, you'll probably need a shader unless your object has a simple structure which enables you to use the vertexdata commands.
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 8th Nov 2012 00:16
Quote: "My main problem with doing this using shaders at the moment is that I have no idea how to access the vertex data of adjacent vertices"


I'm fairly certain you can't. Pretty sure the GPU can't do that. As you said, you'll need to get clever with the info you pass in - UV, diffuse, vertex position - to calculate which area of the texture you're mapping to. I'd probably use a hard-coded number of textures in my atlas (e.g. 8x8), then you could determine which texture you're mapping to from the UV data.

So anything from 0 to 0.125 on the U axis would be in the left-most column of textures. If it's 1 to 1.125 it could have a scale factor of 2 - obviously all that worked out with maths, not hard-coded. Alternatively you could store the scale factor in the diffuse colour, perhaps.
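
One way that encoding might be read in a pixel shader (purely a sketch of one possible interpretation, with an 8x8 atlas assumed and IN.uv standing in for whatever the vertex data supplies):

float tilesAcross = 8.0;
float column = floor(frac(IN.uv.x) * tilesAcross);   // 0 - 0.125 lands in the left-most column, and so on
float scale  = 1.0 + floor(IN.uv.x);                 // 1 - 1.125 gives a scale factor of 2, and so on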

Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 8th Nov 2012 00:18
Quote: "My main problem with doing this using shaders at the moment is that I have no idea how to access the vertex data of adjacent vertices as I suppose I would have to:
ie. if this vertex has a U coordinate of 0.5 it should wrap back and use the texture at U = 0.25 instead, but only if the next vertex (along the U-axis) has a U-coordinate less than the current one. (Otherwise it might be that the current one is the first and we should really step up to 0.75 before changing and so on).
Is there any way to do this?"


Are you now saying that a given surface will be using more than one sub-image?
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 8th Nov 2012 00:46
Quote: "I'm fairly certain you can't."

Yes, that's what I expected as well.
Thanks for the ideas, something worth thinking about

The mesh is pretty easily editable when it comes to its vertex data; basically it's just a big plain segmented into smaller sections using limbs. For each such limb the vertices are sequentially ordered from the bottom-left. There would be no problem writing UV coordinates that wrap around once they reach some set maximum value for the desired sub-image.

Quote: "Are you now saying that a given surface will be using more than one sub-image? "

I suppose, although on different texture stages (that part already works).
What I mean is that if I am to access the sampler2D at offsets from the position given in the pixel shader input, I need to know when to make a jump within it to achieve the sought-after wrapping effect.

Also I wonder, are the UV coordinates sent to the pixel shader automatically filled-in to cover the in-betweens that are not in the vertex data? I believe they are. In that case, should I just write wrapping coordinates to the vertex data of the mesh, won't the "filled-in" values be spanning backwards to the lower coordinate again? (For example the last U coordinate is 0.125, the next one is back at 0.0; won't I get an interpolated value ranging from 0.125 - 0 for the pixels lying between the vertices that have these two endpoints?)
Is this perhaps where Fallout's suggestion about sending extra "scaling" information comes into play to determine when this happens and try to smooth it over?


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 8th Nov 2012 01:11
Quote: "(For example the last U coordinate is 0.125, the next one is back at 0.0; won't I get an interpolated value ranging from 0.125 - 0 for the pixels lying between the vertices that have these two endpoints?)"


Not necessarily. It all depends on the mathematical intrinsic function you use in your HLSL code - and on what values you pass from the vertex shader. Unless someone beats me to it I'll give you a basic example tomorrow (by basic I mean without all the fiddling about with corrections for edge effects).

There was a long discussion about this sort of thing a couple of years ago or so between me and Atom R on the Learning To Write Shaders thread. I'll edit this post with the dates tomorrow.
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 8th Nov 2012 01:15
Ah, I see.
Thank you, looking forward to it


"Why do programmers get Halloween and Christmas mixed up?"
Fallout
Joined: 1st Sep 2002
Location: Basingstoke, England
Posted: 8th Nov 2012 08:52
Quote: "Also I wonder, are the UV coordinates sent to the pixel shader automatically filled-in to cover the in-betweens that are not in the vertex data? I believe they are. In that case, should I just write wrapping coordinates to the vertex data of the mesh, won't the "filled-in" values be spanning backwards to the lower coordinate again? (For example the last U coordinate is 0.125, the next one is back at 0.0; won't I get an interpolated value ranging from 0.125 - 0 for the pixels lying between the vertices that have these two endpoints?)"


Whatever happens to the UV coords once they're passed to the pixel shader, you can still modify them. They're just numbers. So even if they are interpolated between 0 and 0.125 and scaling/mirroring isn't calculated for you, you can still do a little bit of maths in the pixel shader yourself to figure out what to draw.
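
For the 2048x2048 atlas with 512x512 sub-images described earlier, that maths could be as simple as this (a sketch only - the sampler and variable names are assumed, and the sub-image is taken to be the top-left cell, repeated twice):

float2 local   = frac(In.uv * 2.0);   // repeat the surface's 0..1 UVs twice
float2 atlasUV = local * 0.25;        // squeeze the result into the sub-image's 0..0.25 cell
float4 col     = tex2D(atlasSampler, atlasUV);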

Interested to see what GG comes up with.

Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 8th Nov 2012 13:32 Edited at: 8th Nov 2012 13:33
Here's my first demo. This assumes that all your UV coordinates are in the range 0 to 1.

Notice that the sub-images are correctly aligned to the plain but there seems to be some "bleeding" from the neighbouring sub-images along the edges. You can stop that by switching the sampler in the shader from linear to point filtering, i.e. turning the filtering off.

Unfortunately, turning off the filtering is generally undesirable, so you need to do something a bit different, such as surrounding each sub-image with a suitable border - but that's only a partial fix.
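
For reference, a point-filtered sampler in a DX9-style effect file looks roughly like this (a sketch with assumed names, not the actual code from the attached demo):

texture baseTexture;

sampler2D baseSample = sampler_state
{
    Texture   = <baseTexture>;
    MinFilter = Point;   // was Linear - point sampling reads a single texel, so nothing bleeds across cell edges
    MagFilter = Point;   // was Linear
    MipFilter = None;    // mipmapping is side-stepped in this demo anyway
};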

This particular demo has side-stepped the mipmap problem by turning that feature off in the shader.

If this is the sort of use of your main texture you had in mind we can try to do something about the other issues.

Attachments
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 9th Nov 2012 01:18
Thank you GG, that gave me some things to think about, not least that handy offset struct approach.
As for edge bleeding, I suppose it would be entirely possible to pad the individual textures with a few pixels from their opposite sides. What other issues would that approach bring about (since you describe it as "only a partial fix")?

In a way that is the way the main texture will be used in yes, however the problem seems to lie in tiling and/or displaying different such sub-textures on the same mesh.
As previously mentioned, there is an automatic interpolation between the UV coordinates of every two (or three, I guess) vertices. I tried tweaking the vertex shader to get rid of that and jump straight back to the start once the end coordinates are reached, but that didn't work. I suppose some component of the rendering pipeline between the vertex and pixel shaders handles this interpolation. I've also searched around for a way to turn it off, but when one thinks about it, it doesn't make much sense to turn this feature off - it is quite essential for mapping the correct texels, after all.
I eventually decided that it would perhaps be wiser to just set the UV coordinates of the underlying mesh to (0, 0) through (1, 1), use those only to retrieve the current position on the mesh, and let the pixel shader do all the tiling work on its own. This worked out pretty well; to store information about which sub-texture should be used I used the integer part of the U and V coordinates. That however did not work out too well, as it again creates artifacts where all texels between, for example, 1.0 and 2.25 are accessed if two adjacent vertices have such coordinates.
Thus I put separate members in the vs_out structure to hold the sub-texture ID; the mesh's UV coordinates still use the integer parts to point out the sub-texture, but it is now extracted in the vertex shader pass, after which only the fractional part of the UV coordinates is forwarded to the pixel shader. There are still some issues with this producing strange image artifacts. I'm not sure why that is, and I would describe the problems in more detail / upload some code to reproduce them, but I have to get up early in the morning, so I'll have to put that off until the weekend I guess.
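
A minimal sketch of that vertex-shader split (the struct and variable names here are assumed, not my actual code):

float4x4 wvp;   // combined world * view * projection matrix

struct VSOut
{
    float4 pos    : POSITION;
    float2 tileID : TEXCOORD0;   // integer part of the mesh UVs: which sub-texture this section uses
    float2 uv     : TEXCOORD1;   // fractional part: position within the section
};

VSOut VS_main(float4 pos : POSITION, float2 uv : TEXCOORD0)
{
    VSOut Out;
    Out.pos    = mul(pos, wvp);
    Out.tileID = floor(uv);   // note: a vertex sitting exactly on a section's right/top edge gets
    Out.uv     = frac(uv);    // floor = next tile and frac = 0, which may be one source of the artifacts
    return Out;
}
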
Anyway, does this sound like an appropriate approach or am I getting lost in the woods?

And again, thanks for your efforts.


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 9th Nov 2012 13:01 Edited at: 9th Nov 2012 13:06
Quote: "In a way that is the way the main texture will be used in yes, however the problem seems to lie in tiling and/or displaying different such sub-textures on the same mesh."


Not quite sure what you mean by the second part of that sentence. Anyway, here's the next version of my demo with tiling added. As you can see seams are clearly visible. AtomR and I discussed this issue at length around late 2008/early 2009 on the Learning To Write Shaders thread. It'll take me a while to dig into that discussion but I think the upshot was that you can do something about the "mipmapping" seams if you take explicit control of mipmapping via the tex2Dgrad() intrinsic function.

Quote: "to store information about which sub-texture should be used I used the integer part of the U and V coordinates."


That's how I've done it - only just looked at your post.

Quote: "Anyway, does this sound like an appropriate approach or am I getting lost in the woods?"


You certainly seem to be more or less on the right track - but, unless I've misunderstood what you're trying to do, I think the attached demo does what you want apart from the seams. Sounds like you've only wandered a bit off-piste.

Attachments
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 10th Nov 2012 16:17 Edited at: 10th Nov 2012 16:21
Quote: "Not quite sure what you mean by the second part of that sentence."

Basically I want to be able to achieve something like this:

That is a single mesh (plain) that in this case shows four of the sub-textures and tiles each of them two times*.
The information about which tile to display is currently stored as the integer part of the UV-coordinates of the mesh. The example illustrated in the screenshot works simply because the tiles are like so:
(0, 0) (1, 0)
(0, 1) (1, 1).
However should it instead be set to for example
(0, 0) (2, 0)
(0, 1) (1, 0)
there will be issues due to the fact that the UV coordinates are automatically interpolated - therefore I get the tile (1, 0) on a small area in-between the vertices where the one on the left uses texture tile (0, 0) and the one on the right uses (2, 0) (the texture tile is simply calculated as ((int)IN.uv.x, (int)IN.uv.y) as described earlier).
I would use the alpha channel of the diffuse component to store the tile information instead, but I need to be able to have different tiles drawn for different texture stages so that unfortunately won't work.
I'm currently considering just having an array in the shader to carry the tile information, but that would lock me to a certain "tile resolution" since you apparently cannot have dynamic arrays. I would rather not do this since the intent is to use this technique on terrain-type meshes of varying sizes (depending on the scene etc.); just storing the tiling data with the mesh itself (i.e. in the UVs) seems like a much more extensible solution, but if that is not possible I guess I'll settle for the pre-configured approach.


Furthermore, thanks for your second example. Certainly a much more compact shader than mine; I'll look into it further tonight.
As for a solution to the mipmapping issues, that would certainly be nice, seeing as how the mesh using the shader will usually be visible from quite great distances.



-------------
* Actually the tiling is locked to 4x4 for the entire mesh, it does not need to work on a per-tile basis.


"Why do programmers get Halloween and Christmas mixed up?"

Attachments
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 10th Nov 2012 18:13
Thanks for that explanation. Will you be wanting to change the tiling pattern in real time or will it be fixed for each run or at least for substantial parts of run time?

If it is fixed then you could have a master map image which stores the texture information. You could then read your texture offsets from that - and you'd have four colour channels available for four texture stages. You would probably need to use image resolutions that were convenient multiples of each other. If you think that would help I'll see if I can put together another demo.
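
A sketch of how that read might look in the pixel shader (assumed names throughout; the map texture would be sampled with point filtering so the channel values come through exactly):

float4 mapTexel   = tex2D(mapSampler, In.meshUV);                 // one texel of the master map per mesh section
float2 tileOrigin = mapTexel.rg;                                  // e.g. red/green hold the sub-image's offset into the atlas
float2 atlasUV    = tileOrigin + frac(In.meshUV * tiling) * 0.25; // 0.25 = one 512x512 cell of the 2048x2048 atlas
float4 col        = tex2D(atlasSampler, atlasUV);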

Quote: "Sounds like you've only wandered a bit off-piste."


That seems to apply to me not to you.
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 10th Nov 2012 19:24
Hehe.
Clever idea
Well, for the actual runtime (i.e. the game) it won't be updated; it would be nice if it could be painted more or less in real time in the editor though, but that shouldn't be much of an issue - I'm already using the same approach with a blend map used to blend the texture stages together, and that one updates fast enough.
However, won't this run into the same kind of interpolation problems yet again? Or would it perhaps be possible to use another sampler for that texture that has all filtering turned off and get by with that?


Quote: "If you think that would help I'll see if I can put together another demo."

By all means, it would be greatly appreciated
The one issue I can see (supposing there would not be any interpolation issues as described above) is that one might run out of texture stages; currently I'm using four for textures, one for the blend map and then I suppose I'll need one for shadows and another one for lights (or maybe those two could be merged into a single one?). Still should have enough stages unless I think of something else that requires one further ahead though, so this could very well be a workable solution


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 10th Nov 2012 20:17
Quote: "However, won't this run into the same kind of interpolation problems yet again?"


I'm not sure. I don't really see why you can't have all your UV values in the range (0, 1) with this method.

Quote: "By all means, it would be greatly appreciated"


I'll see what I can do - it'll help me at least.

Quote: "Still should have enough stages unless I think of something else that requires one further ahead though, so this could very well be a workable solution"


Yes, it's surprising how texture stages get used up and there are only 8 in DX9.
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 10th Nov 2012 21:40
Quote: "I'm not sure. I don't really see why you can't have all your UV values in the range (0, 1) with this method."

That's possible yes, but I mean if you take a small texture and stretch it over a large surface the pixels tend to get smoothed along the edges. I'm not sure if this is done on the sampler level but I would imagine so.
Anyway, I tried it quickly with another sampler and no filters and it seems to work perfectly for reading exact pixel values from the "tile map"

Quote: "I'll see what I can do - it'll help me at least."

Hehe, only benefits then. Looking forward to it, if you could work that mipmap magic into it that would be awesome.
(The [url=msdn.microsoft.com/en-us/library/windows/desktop/bb509679(v=vs.85).aspx]tex2Dgrad documentation[/url] isn't all that helpful sadly).


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 10th Nov 2012 23:54
Quote: "That's possible yes, but I mean if you take a small texture and stretch it over a large surface the pixels tend to get smoothed along the edges"


I really don't know what you mean by that. With vertex UV values in the range 0 to 1 you can have the texture repeated as many times as you like by using tiling variables in the shader - as in my demo posted earlier. I'm not sure what the edge effects are that you're talking about unless you mean the ones I've been talking about - or are you simply talking about the filtering that's used when a texture pixel covers several screen pixels? But that has nothing to do with the range of UV values used.

Quote: "if you could work that mipmap magic into it that would be awesome"


I think I've got that part working now but I need to check carefully. Hopefully I'll have a demo of that posted tomorrow.

Quote: "(The tex2Dgrad documentation isn't all that helpful sadly)."


Very true.
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 10th Nov 2012 23:59
Quote: "or are you simply talking about the filtering that's used when a texture pixel covers several screen pixels?"

That would be it yes.

Quote: "But that has nothing to do with the range of UV values used."

No, but this time it was about getting the exact colour values instead; I figured those might be filtered before arriving at the pixel shader?
Anyway, that didn't turn out to be a problem after all, as described in my last post.


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 11th Nov 2012 00:18
Quote: "I figured those might be filtered before arriving at the pixel shader?"


That will be either the minfilter or the magfilter setting for the texture read in the shader.
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 11th Nov 2012 21:52
Ok, here's my final demo till Friday 16th.

This shows one way of using a texture atlas of tiles with a texture map. The texture map uses the red and green components to give the UV offsets into the texture atlas (note that I erroneously said yesterday that you could use four texture atlases per texture map - as soon as I started coding it I realised you could only use two since you have two offsets per atlas - unless you are prepared to do more messing about ).

The demo shows the texture atlas on the left hand side of the screen, the final textured object in the middle, and the texture map on the right hand side. You should also be able to see undesirable seams in the middle image. The demo assumes that you want the final object to be tiled 3x3 - you can change this in the demo.

I suspect the seams are caused by at least two things:

1. the linear filtering
2. numerical inaccuracies in the calculation of the offsets.

Also, I'm not sure whether the gradients have been handled correctly yet.

On that last point, my first version of this demo had "tex2D" instead of "tex2Dgrad" by mistake. As far as I can tell the two versions do the same thing when the gradients are supplied as in this line:

Out.col = tex2Dgrad(baseSample, UV, test * ddx(In.dUV), test * ddy(In.dUV)); // attempt to remove mipmapping seams

- and that's suggested by the DX9 documentation although not explicitly stated since the Help file entries appear to be the same.

[The "test *" bits in that line are purely temporary test code to see what's going on. ]
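
For context, the idea behind that line is roughly as follows (a sketch with assumed names; In.dUV is taken to be a continuously varying copy of the mesh UV passed down from the vertex shader, already scaled into atlas space):

// The lookup position uses the discontinuous frac()/offset maths, but the gradients
// are taken from the smooth dUV, so the hardware never sees the jump at tile
// boundaries and doesn't drop to a tiny mip level along the seams.
float2 atlasUV = tileOrigin + frac(In.uv * tiling) * tileSize;
Out.col = tex2Dgrad(baseSample, atlasUV, ddx(In.dUV), ddy(In.dUV));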

The next step is to surround each sub-texture with an appropriate border. This should deal with point 1 above - and at the same time tell us whether the gradient version of the texture lookup has succeeded.

Hopefully I'll find time to deal with that at the end of the week (I'm busy with other things till then ).

Attachments
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 16th Nov 2012 00:33
Sorry for the late reply, I'm pretty busy during the week.

Yes, that is indeed what I was looking for. I did manage to tweak together something achieving similar results last weekend, but your approach looks a lot more... compact, as usual. I will look through it when I get the time, to see how I might make my own shader more efficient / see if I might have done something completely wrong. Also thanks again for the mipmapping solution; I very much doubt I would've figured that out by myself.

It eventually dawned upon me that it is not necessary to have a separate texture stage for each composite layer (I'm also doing blending) with this approach, so that reduced the risk for exhausting the stage capacity quite a bit (six to now three stages).

Quote: "note that I erroneously said yesterday that you could use four texture atlases per texture map - as soon as I started coding it I realised you could only use two since you have two offsets per atlas - unless you are prepared to do more messing about"

It isn't very messy at all; unless you have more than 256 individual subtextures it fits very well into a single colour channel; store it as (tileY * tilesAcross) + tileX and retrieve as tileX = colourChannel % tilesAcross, tileY = colourChannel / tilesAcross
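
In HLSL that decode might look something like this (a sketch; mapColour is assumed to be the point-sampled texel of the tile map, with the index stored as index/255 in one channel):

float tilesAcross = 8.0;
float index = floor(mapColour.r * 255.0 + 0.5);           // back to the 0..255 integer that was stored
float tileX = fmod(index, tilesAcross);
float tileY = floor(index / tilesAcross);
float2 tileOrigin = float2(tileX, tileY) / tilesAcross;   // top-left corner of the sub-image in atlas UV space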


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 16th Nov 2012 12:12
Quote: "It eventually dawned upon me that it is not necessary to have a separate texture stage for each composite layer (I'm also doing blending) with this approach, so that reduced the risk for exhausting the stage capacity quite a bit (six to now three stages)."


Very true. You could put everything into one big texture. The downside is the extra coding needed to read the right bit of the texture - and the limit on overall texture size of course. But yes, you could certainly do a lot with just three or four atlases.

Quote: "It isn't very messy at all; unless you have more than 256 individual subtextures it fits very well into a single colour channel; store it as (tileY * tilesAcross) + tileX and retrieve as tileX = colourChannel % tilesAcross, tileY = colourChannel / tilesAcross "


Good suggestion.

The only thing I was thinking of adding now was a border around the images to eliminate, or at least reduce, the remaining seams. Anything else? I hope to have time later today to look into this.
Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 19th Nov 2012 11:13
Quote: "The only thing I was thinking of adding now was a border around the images to eliminate, or at least reduce, the remaining seams."

I suppose that's it
I can probably achieve that myself if you're busy, no worries; I guess all you do is calculate the border width (as a percentage, like all the other coordinates) and add / subtract it from the sampling positions? Or is there maybe some fancier way to achieve it more quickly (I was thinking of perhaps having a pre-defined matrix for looking up the start and end points of the individual subtextures)?


Quote: "Anything else?"

Hm... got any clever mass-foliage ideas?
I had planned on doing it the simple way, using some base meshes with, for example, 20 pieces of grass each and then instancing these about, but I realized that you can't change the individual limb offsets of an instanced object so that won't work well (having a separate object for each such piece of grass / what-have-you unfortunately causes great slowdowns, likely because each one results in an individual draw call). That made me think of the animated ground cover shaders found in newer (commercial) games; I won't need anywhere near that density, or any animation, but the pure existence of such shaders seems to imply that it should be possible to render a mesh in multiple places at once... or doesn't it? Perhaps a loop wherein you change the WVP would work, but I get the feeling that might be slow?


"Why do programmers get Halloween and Christmas mixed up?"
Green Gandalf
VIP Member
Joined: 3rd Jan 2005
Playing: Malevolence:Sword of Ahkranox, Skyrim, Civ6.
Posted: 19th Nov 2012 12:01
Quote: "I suppose that's it"


I hope so.

I'm away again till later in the week. I'd hoped to finish this off before now but should have time to spare this coming weekend. I'll post a simple demo then anyway - but as you say the changes should be minimal.

To save time I'll use your images as they are since they, in effect, already have a border. Of course, for real textures you'd need to fiddle with the border a bit. For example, your seamless texture could occupy, say, 480x480 pixels, with it wrapped to fill the remaining border up to the full 512x512 pixels. It's one of those things where you can't be sure how successful it's going to be till you try.

Quote: "but I realized that you can't change the individual limb offsets of an instanced object so that won't work well"


You could use a small set of grass objects and randomly instance those and give them random rotations etc. That might be enough to destroy any obvious repetition which I assume is what you're worried about.

Anyway, bye for now till later in the week.



Rudolpho
Joined: 28th Dec 2005
Location: Sweden
Posted: 20th Nov 2012 02:13
Quote: " For example, your seamless texture could occupy, say, 480x480 pixels, with it wrapped to fill the remaining border up to the full 512x512 pixels."

Think it will need to be that much? I would have guessed maybe 2 - 4 pixels should do (based on how the seams look)

Quote: "That might be enough to destroy any obvious repetition which I assume is what you're worried about."

True, but the main issue is getting the individual parts not to end up below / above non-flat ground. It can probably be solved well enough by thought-through level design when it comes to it, though.


"Why do programmers get Halloween and Christmas mixed up?"
