
DarkBASIC Professional Discussion / The Ultimate Shader Thread 2.0

Bush Baby
19
Years of Service
User Offline
Joined: 23rd Apr 2005
Location: A cave beneath Jerusalem
Posted: 11th Dec 2005 00:22
Well, heck yes.
Just look at Quake3, it's shader heaven.

|| Bush Babies are funny little hermits, disguised as rats. ||
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 16th Dec 2005 23:20
So has anyone ever made a rain/snow shader? I just can't imagine that some of the weather effects seen in recent games are done with plain old particles anymore - they have to be using shaders, right?

All you need is zeal
Ian T
21
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 17th Dec 2005 06:24
Depends on what you mean by recent games... FEAR had a shader for pretty much everything, as do next-gen games such as Oblivion and Gears of War, but pretty much everything before that era, to my knowledge, used particles.

Snow would look nice with some kind of blur effect for the snowdrops, but that'd be expensive as all hell... as for rain, it looks pretty fine just using nice particles.

But I'm not exactly the font of knowledge here.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 17th Dec 2005 07:58
Splinter Cell (I think even the first one) used a shader for rain. So did Pirates of the Caribbean (and that's old as crap). I think they've been used even before that. I HEAR they aren't that tricky to do (although I have no real shader knowledge, so that doesn't mean anything, ehe).

With all the talent here SOMEBODY must have tried/made one before.

Ian T
21
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 18th Dec 2005 02:44
Quote: "So did Pirates of the Caribbean (and that's old as crap)"


Okay, now I feel old

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 18th Dec 2005 08:34 Edited at: 18th Dec 2005 08:35
Ehe, did anyone actually play that game (POTC)? It was made by Bethesda (the Elder Scrolls people). The graphics were amazing; the water and weather effects were second to none. The gameplay was awful, however; the only reason I beat it was the graphics, heh.

Anyway, I'm really curious to get an expert's opinion on this. If it's too hard to make a nice-looking weather shader, I'll keep kicking my particle system till it looks like rain.

Ian T
21
Years of Service
User Offline
Joined: 12th Sep 2002
Location: Around
Posted: 20th Dec 2005 05:10
Quote: "It was made by Bethesda (the Elder Scrolls people). "


Actually, they only published it. They're a publisher as well as a developer, which is often forgotten as they're very minor league.

I tried out the Splinter Cell Chaos Theory demo again and damn, that rain does look sweet. Wish I knew how they did it.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 20th Dec 2005 23:23
Really? I thought they had some hand in the development too.

Anyway, I imagine shader rain is a simple 2D plane billboard with a hellishly complicated pixel shader that 'projects' a rain texture based on the angle of the camera. No idea how collision detection would be done (in some of these games you can see the rain splashing when it hits the ground).
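Something like this, maybe? (Pure guesswork on my part - every name here is made up, and I have no idea if it's how the real games do it):

```hlsl
// Guesswork sketch: a billboard quad whose rain texture scrolls downward.
// "time" and "rainTex" are invented names, set from the app side.
float time;
texture rainTex;
sampler rainSamp = sampler_state { Texture = <rainTex>; AddressU = Wrap; AddressV = Wrap; };

float4 RainPS( float2 uv : TEXCOORD0 ) : COLOR
{
    // scroll the uv downward over time; wrap addressing repeats the streaks
    float2 scrolled = uv + float2( 0.0, time * 2.0 );
    float4 rain = tex2D( rainSamp, scrolled );
    rain.a = rain.r;   // bright streaks opaque, dark background transparent
    return rain;
}
```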

that dude
20
Years of Service
User Offline
Joined: 1st Jan 2004
Location: USA
Posted: 22nd Dec 2005 01:13
I have two theories of it hitting the ground. The first is a second shader on the ground that correlates with the rain shader.

http://www.nuclearglory.com/?u=fearik = sweet as hell collision system. easy on that leather thing in your back pocket too.
the left side of my head isn't bigger, the right side is just smaller
Olby
20
Years of Service
User Offline
Joined: 21st Aug 2003
Location:
Posted: 23rd Dec 2005 23:30
I think it would be too complicated and too FPS-expensive. Since DBPro is not the fastest shader-capable engine out there, I will stick to the old-fashioned particle system.

What's crackin' homie?
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 26th Dec 2005 08:54
Woohoo! I got my cloud shader working (well, almost). The problem is, when I apply the effect to my sky dome, it cancels any vertex alpha. My skydome is a dome (imagine that), and the verts at the very edge are invisible, so the clouds fade out into the horizon. This is a huge problem.

Here's the .fx file; it's basically Morcilla's blur shader, all I did was modify the pixel shader slightly.

BTW, while you're looking at it, is there any way to clean it up? I don't understand why I need all the extra crap in the vertex shader (and some stuff in the pixel shader, for that matter). ALL I want to do is blend 4 images together (each with a different weight).

Thanks in advance

Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 27th Dec 2005 16:52 Edited at: 27th Dec 2005 16:55
Zealous, welcome to HLSL. I'm not sure what you want to do, but when one starts with shaders, one never totally knows what one is doing.

This is a reduced version, but I haven't tried it. It only uses texcoord0, as the shader only uses one set of texture coordinates (uv), and adds the color of the 4 textures at those coordinates (each multiplied by its factor):
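From memory, the reduced pixel shader would be something in this spirit (untested; the sampler names and factors are invented for the example):

```hlsl
// Sketch: add the colour of 4 textures at the same uv, each multiplied
// by its own factor. Sampler names are invented.
sampler s0 : register(s0);
sampler s1 : register(s1);
sampler s2 : register(s2);
sampler s3 : register(s3);

float4 BlendPS( float2 uv : TEXCOORD0 ) : COLOR
{
    return tex2D( s0, uv ) * 0.50
         + tex2D( s1, uv ) * 0.25
         + tex2D( s2, uv ) * 0.15
         + tex2D( s3, uv ) * 0.10;   // factors sum to 1 so nothing blows out
}
```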


You were using texcoord1, which was the color of the pixel to the top-left of the current pixel... You can play a LOT once you get it.

Here's the full .fx source; I commented what I thought could be off (but I haven't had time to compile it yet, so try it):



About the alpha transparency, I added some renderstates, but as I haven't tested them, feel free to play around to get the effect you want:


Also take care with the alpha value for getting some transparency. I think it should be near 0, so the calculation is not bad, but it depends on "a":


Tell me if it works or not. It wouldn't be a bad idea to post the .dba and/or some specific media.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 27th Dec 2005 22:17 Edited at: 27th Dec 2005 23:44
Cool, thanks for the info! But there's still a lot of stuff in your vertex shader. Do I need all this blur stuff? Let's say all I wanted was a bare-bones shader. All it did was blend two images together, that's it. What would it look like?

Also, could you explain a little more about what you meant by "You were using textcoord1, that was color of the left-top pixel of the current pixel"? I was using texcoord0 (or was I?), which I thought meant whenever I was working with a pixel, I was working with a pixel based on the uv coords of stage 0 (the one all my textures were using).

- On a side note, about the transparency: I'm not trying to get transparency in the image, but rather in the VERTS. Is it possible to set the VERTEX alpha in a shader? My skydome HAD vertex alpha around the edges (so the clouds would fade off into the horizon); however, as soon as I applied this shader, the vertex alpha was lost.

Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 28th Dec 2005 17:24 Edited at: 28th Dec 2005 17:27
Yeah, maybe you could use only the pixel shader and not the vertex shader, as you only want to change the texture. Try commenting out the vertex shader compilation, like this:

If it works, then you can take out the BlurVS1 part. If it doesn't, don't worry; it is just calculating the ambient+diffuse light. That's kind of a minimum.

Quote: "I was using texcoord0 (or was I?), "

Well your original code was at the pixel shader:

That's using texCoord1. If you look at the vertex shader, texCoord1 was:

That means the original uv coordinate (texCoord0) displaced by (u = -pSize, v = pSize). Look at this scheme, where

- 0,0 = texCoord0
- ps = pSize = pixel size, or size of the displacement
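So the vertex shader was doing something like this (a sketch from memory, untested):

```hlsl
// Sketch: the VS passes the base uv through on TEXCOORD0 and a copy
// displaced by one pixel (the left-top neighbour) on TEXCOORD1.
float4x4 wvp : WorldViewProjection;
float pSize;   // size of one pixel in uv space, set from the app

struct VSOut
{
    float4 pos : POSITION;
    float2 uv0 : TEXCOORD0;   // the pixel itself
    float2 uv1 : TEXCOORD1;   // its left-top neighbour
};

VSOut OffsetVS( float4 pos : POSITION, float2 uv : TEXCOORD0 )
{
    VSOut OUT;
    OUT.pos = mul( pos, wvp );
    OUT.uv0 = uv;
    OUT.uv1 = uv + float2( -pSize, pSize );   // u = -pSize, v = +pSize
    return OUT;
}
```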



And about the verts, no idea. You should investigate further. Post some screens; maybe I don't understand you. No real shader expert here, just some experience.

andre
18
Years of Service
User Offline
Joined: 19th Dec 2005
Location:
Posted: 28th Dec 2005 19:44
I'm a noob to all that

DARK_VIRUS
pm me to help start a game making company!
--------------------if you worry you die and if you dont worry you die. f#%k worry
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 28th Dec 2005 23:12 Edited at: 28th Dec 2005 23:15
Ah, so "texcoords" have NOTHING to do with different UV stages? I always assumed when you used texcoord0, you would get a pixel/color based on the UV data for that texture UV STAGE. Likewise for stages 2, 3, etc...

So I COULD scale/scroll the uv data of the object itself, or work with different textures on different STAGES (with different uv data), and the pixel shader would continue running happily?

Quote: "Post some screen, maybe I don't understand you"


Just imagine a plane (well, more like a disk/dome) with a cloud texture that's floating above the camera. I want the 'edge vertices' to have an alpha value of 255, so the cloud TEXTURE will seem as though it's fading out into the horizon. I already constructed a mesh just like this, and it looks GREAT; problem is, the vertex alpha data is lost when I apply my shader.

So, in my vertex shader, I need a line something like...

if VertexYPos < 0.0 then SetVertexAlpha(255);
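In HLSL I'd guess that translates to something like this (no idea if it compiles; the struct layout is assumed, and the alpha convention may need flipping):

```hlsl
// Guesswork sketch: zero the vertex colour's alpha for rim verts
// (those pulled below y = 0) so the dome fades at the horizon.
float4x4 wvp : WorldViewProjection;

struct VSIn  { float4 pos : POSITION; float4 col : COLOR0; float2 uv : TEXCOORD0; };
struct VSOut { float4 pos : POSITION; float4 col : COLOR0; float2 uv : TEXCOORD0; };

VSOut FadeVS( VSIn IN )
{
    VSOut OUT;
    OUT.col = IN.col;
    if ( IN.pos.y < 0.0 )
        OUT.col.a = 0.0;   // transparent at the rim (flip if the convention is reversed)
    OUT.pos = mul( IN.pos, wvp );
    OUT.uv  = IN.uv;
    return OUT;
}
```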

Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 29th Dec 2005 13:49
Yeah, nothing to do. If you look at the original code, it only uses one texture, at stage 0. But I could have used as many texcoords as I wanted; they are just variables.

You are the one stepping ahead using more than one texture, so run your tests to find out, but I guess that all stages share the uv coordinates; not totally sure right now. Uh.

And about the vertex alpha: if you want that, you have to calculate the vertex distance and set it accordingly. Right now it does:

So it doesn't discern based on distance, but on "a", which depends on the red color and the cloudcover variable:


Apart from that, when it calculates the color, you wrote:

That means you'll always get a grey-scale image based on red, because "a" depends on the red color.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 29th Dec 2005 23:39
Yeah, I want greyscale; that's why I'm only looking at the r channel. It's just for clouds.

But how do I "calculate the vertex distance and set it accordingly"? And again, I don't want the vertex alpha to have anything to do with the PIXEL shader (assuming that's possible; it should be). So "a" would have nothing to do with what I'm trying to do.

I just want this line in my vertex shader...

if VertexYPos < 0.0 then SetVertexAlpha(255)

This will set all the EDGE vertices (which are pulled down slightly below 0.0) to have a 255 VERTEX alpha value. Then the pixel shader goes about its merry way as if it has no idea how things fade off at the edges.

Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 30th Dec 2005 11:31
Every vertex's position info passes through the shader with:

vPos is a float4; with that you can access the vertex position values:

vPos.x
vPos.y
vPos.z
vPos.w

And calculate distances to (for example) values provided externally with some distance function (you know, the typical sqrt(x*x + y*y + z*z)).

I suggest looking for some shader that already calculates distances and trying to see how it is done. I don't have any example to hand, but it usually appears in shader books, as it is fairly common to calculate distances to the vertex.

For what you want, it could be something like:

At the vertex shader. If you want to do anything with the vertices, you cannot use only the pixel shader.

You may have to revise the syntax, as I do not remember how to write "if"s in HLSL, but it can be done since 2.0 (more or less, lol).
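For example, something like this might do it (syntax unverified; centre, fadeStart and fadeEnd are made-up variables you would set externally):

```hlsl
// Sketch: fade a vertex colour's alpha with distance from some point.
float3 centre;      // e.g. the dome or camera centre, set from the app
float  fadeStart;   // distance where fading begins
float  fadeEnd;     // distance where alpha reaches 0

float4 FadeByDistance( float4 vPos, float4 col )
{
    // distance() does the typical sqrt(x*x + y*y + z*z) for you
    float d = distance( vPos.xyz, centre );
    col.a *= saturate( 1.0 - (d - fadeStart) / (fadeEnd - fadeStart) );
    return col;
}
```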
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 30th Dec 2005 23:29
OK, cool, so that's how I get the vertex position info. But is there no way to set the VERTEX alpha? You keep setting alpha on a per-pixel basis, and that seems like overkill. The shader would run much faster if I could just run a check on each vert, rather than each pixel.

I imagine it's just some command neither of us seems to know; that's all I need, then I think I'll be set.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 1st Jan 2006 05:47 Edited at: 1st Jan 2006 05:50
Well, I cleaned up my cloud shader and simplified it quite a bit (it doesn't even use a vertex shader anymore!). Take a look...



*Edit: I was thinking about this too. Right now I'm using an individual texture for each noise octave. That's wasteful; I should just combine the 4 octaves into one image (one per channel: r, g, b, a). Since we're dealing with greyscale, I might as well take advantage of those color channels. Then I would only have to sample ONE texture rather than FOUR! Look for that in version 2.
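I imagine version 2's pixel shader would boil down to something like this (untested; the weights and the cloudcover maths are made up):

```hlsl
// Sketch: four noise octaves packed into one RGBA texture, sampled once.
texture noiseTex;
sampler noiseSamp = sampler_state { Texture = <noiseTex>; };
float cloudcover;   // density control, set from the app

float4 CloudPS( float2 uv : TEXCOORD0 ) : COLOR
{
    float4 octaves = tex2D( noiseSamp, uv );                    // r,g,b,a = the 4 octaves
    float n = dot( octaves, float4( 0.5, 0.25, 0.15, 0.10 ) ); // weighted sum in one dot()
    n = saturate( n - (1.0 - cloudcover) );                     // made-up density threshold
    return float4( n, n, n, 1.0 );
}
```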

And I know I will need to implement a vertex shader if I want to get the vertex alpha working again. Although it kinda pisses me off how simply applying a PIXEL shader (which has nothing to do with the vertices) wipes all the vertex color/alpha data. Is this a bug?

Also, I found this cool article on rain/snow, and I think it could be 'shaderfied'.

http://www.ofb.net/~eggplant/rainsnow/rainsnow-sketch.pdf

Of course, it's really designed for a flight sim, and I imagine it wouldn't work in any other type of game. I mean, what would happen if you were up close to a building (or any object)? You would still see rain falling in the distance, as if there were infinite depth between you and the object you're standing right in front of. Any way to solve this?

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 1st Jan 2006 23:26
In theory, I should be able to get decent cloud animation simply by scrolling the different textures (noise octaves) at different speeds. However, now that I've merged the 4 images into one (each of the 4 channels is its own noise octave), that's no longer possible. So what I need now is a way (for example) to combine the red color at uv 0,0 with the blue color at uv 0.1,0, etc...

...or perhaps, if I'm going to do the animation this way, I need to go back to using an individual image for each octave.

Also, I still need a way to set the vertex properties (alpha, mainly).

Catalyst
20
Years of Service
User Offline
Joined: 6th Sep 2003
Location:
Posted: 2nd Jan 2006 07:55
This is in regard to Neophyte's modified toon shader. It's a nice shader and looks pretty good, except when you have a lot of overlap on your character and a heavy stroke. Example using the generic Miko model:



This looks very nice. The stroke isn't perfectly consistent, but overall it's great. But then rotate the object a bit and you see this:



Now, I've tried messing with the image it's using for the stroke, but that makes it either too thin, or start fading into black, or whatever other problem. So, a few options. The easy one is that I'm missing something and there's a quick fix. That would be nice. Or, would it be possible to change this shader so it doesn't have this effect? The stroke is cutting in on the character; it would be nice if it were on the outside. Since it's cutting in, I'm guessing there's a reason for that and it would be hard to modify. The other option is flipping the normals of a scaled copy and colouring it black. Not too keen on that one, but if it gives me the look I want, then it's the one to go for. Any other options I've missed? Anyone else run into this issue, or just find a better way to do a stroke?
Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 3rd Jan 2006 18:28
Zealous,
Quote: "The shader would run much faster if I could just run a check on each vert"

That's what a vertex shader does (the vertex part of the shader). It processes a stream of vertices, performing all commands for each one of them. So setting the alpha value for a vertex should be done, as I said before, in the vertex shader.

Catalyst,
it would be nice to see the code that you are using, so we can understand the problem better.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 4th Jan 2006 03:03
Ah, cool, I didn't notice that the color part was in reference to the VERTEX; that should be just what I need!

Next question: how can you SCROLL the vertex uv data in a shader? DBPro only lets you scroll stage 0.

Catalyst
20
Years of Service
User Offline
Joined: 6th Sep 2003
Location:
Posted: 4th Jan 2006 06:25
@Morcilla

No code needed. It's just Neophyte's version of the cartoon shader applied to a model; nothing is being coded wrong, the result just isn't giving me what I want. The problem is visible right in the image: the stroke cuts into the model instead of being on the outside of it. That's just the way the shader works; it would be great if there were a way to do it differently.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 5th Jan 2006 01:50 Edited at: 5th Jan 2006 05:26
Well, I got a vertex shader to work with my pixel shader. However, I can't get the vertex output color to have a 0 alpha value. Check out the vertex portion and tell me if you see anything wrong.



On the plus side, I think I see now how to scroll the uv data inside the shader.

*Edit: well, I know how to scroll stage 0 (simply add an ever-increasing variable to the out.uv), but how in the hell do you scroll the OTHER stages?

Morcilla
21
Years of Service
User Offline
Joined: 1st Dec 2002
Location: Spain
Posted: 5th Jan 2006 13:15
@Zealous,
Ok, as far as I know you are walking in the right direction. Please notice that you are not using any render states (do you remember?); I suggest doing so to get it working.
Also, don't use 1,1,1 as the vertex color when doing your tests, as then the vertex doesn't have any "black" and you could be missing some results.
Even with that, we could be missing something and you might not get exactly what you want; welcome to shaders! If it's done some other way, I have no idea right now.

And about scrolling the other stages... uh. I suppose the vertex input struct should come in with more than one "float2 uv" to do that. The vertex flow information must be set within the DBPro code.

Also, the VS outputs more info (position, texcoord, color) than the PS receives (only texcoord), so watch out for unexpected results.

@Catalyst,

Quote: "No code needed. It's just Neophyte's version of the cartoon shader "

Yeah, you refer to that code as if it were famous the whole world over. Please don't make me crawl for it. But before you post anything else, I'll tell you that if it's in asm, I cannot help you very well.
Also, the media used is quite important, as you experienced yourself: not only the image, but also the model itself.
Beware that any shader will always be "on the object surface", and that an effect surrounding the object will be hard and expensive.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 5th Jan 2006 23:13
Ah, right, I forgot you mentioned render states earlier. So where would I put this...



Also, for the uvs: you're saying I need to grab input from the different stages in my input struct? So add lines like...

float2 uv1: TEXCOORD1;
float2 uv2: TEXCOORD2;
float2 uv3: TEXCOORD3;

But wait... you said texcoords have nothing to do with uv data, that they just represented a 'pixel offset' or some such thing... now I'm really confused.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 7th Jan 2006 01:42
Well, just for the heck of it, I thought I'd try it. Sure enough, it WORKS! If I add an output to my VS that returns 4 different "TEXCOORDS" (0-3), each scrolling at a different speed, then INPUT those same texcoords to my PS, I just have to do this...

color += tex2D( texsamp, IN.uv(example 0-3) ) * weight;

I get JUST the effect I'm looking for: each noise octave now scrolls at its own speed, giving gorgeous cloud animations (and I can still control the overall density with 'cloudcover').
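Boiled down, the vertex shader now does something like this (a sketch; the scroll speeds are made-up numbers):

```hlsl
// Sketch: emit four copies of the base uv, each drifting at its own speed,
// so each packed noise octave animates independently in the PS.
float time;   // ever-increasing value, set from the app
float4x4 wvp : WorldViewProjection;

struct VSOut
{
    float4 pos : POSITION;
    float2 uv0 : TEXCOORD0;
    float2 uv1 : TEXCOORD1;
    float2 uv2 : TEXCOORD2;
    float2 uv3 : TEXCOORD3;
};

VSOut ScrollVS( float4 pos : POSITION, float2 uv : TEXCOORD0 )
{
    VSOut OUT;
    OUT.pos = mul( pos, wvp );
    OUT.uv0 = uv + time * float2( 0.010, 0.002 );
    OUT.uv1 = uv + time * float2( 0.017, 0.003 );
    OUT.uv2 = uv + time * float2( 0.031, 0.007 );
    OUT.uv3 = uv + time * float2( 0.055, 0.011 );
    return OUT;
}
```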

HOWEVER, this means everything you said before is incorrect (or perhaps I just didn't understand it). You said texcoords have nothing to do with UV data, and that they instead represent that 3x3 'displacement chart' you mentioned. Can anyone clarify?

EVOLVED0
19
Years of Service
User Offline
Joined: 1st Mar 2005
Location: UK
Posted: 7th Jan 2006 02:45 Edited at: 7th Jan 2006 02:56
texcoord(0-7) = uv data (the FVF format has a max of 8 texture stages per vertex); you can manipulate them to output any data, etc. (but below ps2.0 only 0-5 will be recognized). colour(0-1) is just vertex colouring; the PS will recognize this, but it turns out vertex-based(?) as opposed to per-pixel accuracy. And I don't think you can alpha individual vertices, as alphaing is done through the alpha buffers. x,y,z,a: when outputting colours, the alpha stands for the final colour output; it does not mean "ghosting". I think that's what you were asking a few messages back? Well, that's as far as I know...


edit:


Here's a better way of doing per-pixel lighting, done completely in the PS, if anyone wants it.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 7th Jan 2006 06:33 Edited at: 7th Jan 2006 08:33
Akk, there's no way to do vertex alpha? DBPro can do it easily (you just set the alpha channel in the vertex color), but like I said, when I apply even a simple PIXEL shader (no vertex shader), ALL vertex color/alpha is lost.

There has to be a way to get it to work...

*Edit: Quick side-note question. I'm adding a second layer to my clouds, but I'm having trouble blending the two layers together. What are my options for adding colors besides color1 += color2; color1 /= 2.0?

EVOLVED0
19
Years of Service
User Offline
Joined: 1st Mar 2005
Location: UK
Posted: 7th Jan 2006 17:23 Edited at: 7th Jan 2006 17:27
lerp( color0 , color1 , 0.5); ?

I did a quick test on changing the alpha amount, but it seems to do nothing. I'm sure I read somewhere that inverting the alpha channel will give you a negative alpha blending; other than that, I'm not sure.

hlsl:


dbp:
Catalyst
20
Years of Service
User Offline
Joined: 6th Sep 2003
Location:
Posted: 7th Jan 2006 21:25
@Morcilla

It is in ASM, but I'll post it anyway.



The model isn't too terribly important, as anything more than a simple object will show the issue, and the images the shader uses aren't that important either, since you'll either get the effect shown in the images or just a thin line, as I said. Now, I really don't know how to write shaders at all, but looking at this one, it seems to be getting the dot product of vertex to camera so it can see which ones are at an extreme angle and start using the edge image to make those black. Would it be possible to offset them slightly along their normal? The black would still be cutting into the image, but you could use a thinner line and have it be more selective as to which vertices to affect, and with the edge verts stretched out a little, the inking would look wider.

There's also this other shader:
http://www.gamedev.net/reference/programming/features/cartoon/page2.asp

That one looks to be using a pixel shader to get the stroke on the outside of the object. Looks quite nice... looks quite slow, too.

Any thoughts anyone? Any other options?
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 7th Jan 2006 23:44 Edited at: 8th Jan 2006 04:35
Hmm, thanks Evolved, I'll try using a negative alpha when I get home (all the other stuff in your example doesn't make much sense to me).

As for color blending, I don't want to lerp. I want to combine two textures, as if there are two layers of clouds, one on top of the other. If I just ADD the two colors together, wouldn't I need to do some kind of clamping, to ensure the colors never go out of bounds?
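Maybe something like this? (Just guessing, but saturate() clamps each component to the 0-1 range):

```hlsl
// Sketch: combine two cloud layers additively without going out of bounds.
float4 CombineLayers( float4 lower, float4 upper )
{
    return saturate( lower + upper );            // hard clamp to [0,1]
    // a "screen" combine avoids the hard clip, if that looks better:
    // return 1.0 - (1.0 - lower) * (1.0 - upper);
}
```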

* Edit - Unrelated follow-up question. I've always wanted to know how to make an effect local to a certain radius around the camera. For example, in my terrain shader I only want to apply the effect to pixels within a certain distance of the camera. Sort of a fog effect that fades things out, so you can gradually blend to a 'less complex' shader.

I imagine all you need is the distance from the camera to the pixel (if such a thing is possible); how would this be done?

And the REAL question: due to mipmapping, would this even be worth it? I mean, doesn't a complex pixel shader run FASTER on objects farther from the camera (due to the reduced pixel count)?

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 8th Jan 2006 07:34
Well, I had a chance to test your code, Evolved, and I can't see anything going on. Maybe it's the texture I'm using (just a basic 32x32 no-alpha PNG), but I see no effect in action at all. AND I tried using a negative value for my alpha; also no transparency.

How in the hell do you apply alpha to these verts?

John H
Retired Moderator
21
Years of Service
User Offline
Joined: 14th Oct 2002
Location: Burlington, VT
Posted: 8th Jan 2006 08:09
You can apply alpha to verts in DBP in multiple ways. I'd suggest using the new (well, 5.8-new) vertex commands:

http://darkbasicpro.thegamecreators.com/?f=upgrade_5_8

Check 'em out.


Join Our Forums and get game updates faster!
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 8th Jan 2006 21:34
Doesn't work. The shader overwrites the vertex color (even a PIXEL shader with no vertex shader).

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 11th Jan 2006 02:37
Something I have always wondered... a vertex shader runs for a single vertex, then outputs that info to a pixel shader. The pixel shader bases its output on that vertex input. Well, how can that work?

I mean, it LOOKS like a single vertex is the only thing the pixel shader sees. Or is the vertex shader run for every single vertex, and then a 'super' output created, AFTER which the pixel shader is run (so it knows about all the verts)?

Of all the books/papers I have read, I've yet to hear a good explanation of the ORDER in which data flows inside a shader.

Red Ocktober
20
Years of Service
User Offline
Joined: 6th Dec 2003
Location:
Posted: 14th Jan 2006 22:14
Can I jump topics for a sec...

Evolved's Water Shader... is there a way to get it to run on anything less than ps/vs 2.0...

... ps/vs 1.4 maybe?

or is there a water shader running around that will run in ps/vs 1.4...

thx

--Mike
Ninja Matt
19
Years of Service
User Offline
Joined: 5th Jun 2004
Location: Lincolnshire, UK
Posted: 15th Jan 2006 02:13 Edited at: 15th Jan 2006 02:24
*climbs out of hole*

Hmm, much confusion over alpha channels. I haven't used them much, but I'll see if I can help.

Alpha blending, in all instances, needs to be done on a per-pixel basis, since that's how it fits into the pipeline. Vertices are there to define triangles for the rasteriser to fill with pixels and, as such, aren't actually rendered themselves. Even if they were, vertex-based alpha blending would basically just end up with you having a bunch of transparent, vertex-sized dots scgttered over your model, right? (That's not an SP there - it won't let me use an A!)

It's not just alpha-blending, either - UV mapping, lighting, texture lookups, whatever, they all happen on a per-pixel basis for the same reason. Even when you actually calculate something in the vertex shader, it ultimately ends up on the screen as a pixel, although sometimes with very little work from the pixel shader.

That's confusing, right? Well, shaders are only a small part of the entire graphics pipeline, and they're not even next to each other, either. The vertex shader comes in pretty soon, after some boring memory-related stuff, and builds your scene for you. Once it's done, you'll have a big set of transformed vertices, complete with all sorts of extra data like UV maps, colours and, of course, alpha.

Prior to the pixel shader, some of these vertices will get discarded or changed, usually by the view frustum, which is basically your field of view. No point rendering stuff that won't be seen, after all.

Another important step between the two shader units is interpolation, which is how the vertex shader's output is heard by the pixel shader. Basically, consider a pixel anywhere on a polygon. To get the alpha value of that pixel, you linearly interpolate the alpha values of the vertices across the polygon. It's easier to understand the concept if you consider a point on a line being rasterised: let's say the left vertex of the line has an alpha of 0% while the right has 100%. A pixel rasterised half way along the line would have an alpha of 50%, due to the linear interpolation. It's a pretty simple idea, but I haven't described it well, so I can look for a diagram if nobody understands me.
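As a tiny illustration, HLSL's lerp intrinsic is exactly this interpolation:

```hlsl
// The rasteriser effectively does this for every interpolated value:
float InterpolatedAlpha( float alphaLeft, float alphaRight, float t )
{
    return lerp( alphaLeft, alphaRight, t );   // t = 0 at the left vert, 1 at the right
}
// lerp( 0.0, 1.0, 0.5 ) gives 0.5 - the 50% alpha halfway along the line
```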

This linear interpolation's been around for years, long before programmable shaders! It's the main basis behind smooth Gouraud lighting, which calculates illumination on a per-vertex basis and then interpolates it across the surface. All data is passed into the pixel shader in this way, even UV coordinates.

Once the pixel shader's done its stuff, you start to get a decent-looking scene - everything's coloured, lit, wavy, shiny, bumpy, whatever your pixel shader's meant to do to it. The only things left after the pixel shader are depth tests and alpha blending, which finally make everything look how it's meant to.

So, what have we learnt? Well, it's all very complicated and most people don't really know how it works. I don't either, really - this is all coming from how I interpret everything, and I could be wrong on some of the technicalities. It's also worth noting that using vertex-based alpha isn't actually much quicker than per-pixel alpha, and may actually be slower in some cases, such as when you use the alpha channel of the diffuse texture.

If you're having trouble understanding any of this, let me know and I'll try to be a bit clearer!

-----

Anyway, now for something a bit more practical.

If you've encoded the alpha in the vertex, I'd assume that it's actually stored in the alpha channel of the vertex colour, which is usually going to be plain white. So, in ASM, you'd need something like this:

...
dcl_normal v1,
dcl_color v2,
dcl_texcoord v3,
...

And in HLSL, you'll want a line like this in your vertex input structure:

float4 VertColour : COLOR[0]

That might not be exactly right, but it's something like that! You'll then need to pass this information down to the pixel shader, remembering that the alpha should be in the .w component. Once you've got it into the pixel shader, you'll have to stuff it into the .a component of the output colour, otherwise it probably won't work as you'd like. Finally, don't forget to set a blend mode, either in DB (the actual blending is done independently of the shader, remember) or in the shader itself.

That should be it! Probably. I haven't tested it, but I'd expect that this is how it works, so I'm hopefully not too far off!

-----

Okay, here's some other quick notes that you might find helpful, just to answer some of the questions floating about:

Modern 3D cards ALWAYS need both shaders. They simply won't do without them, since they're such important parts of the pipeline. If you don't explicitly give the hardware a shader to use, it'll either be grumpy and not work or substitute one of its own stock shaders. Think you're running your game without touching shaders? Think again - the old fixed-function pipeline doesn't exist anymore, as it's now fully emulated by the shaders.

Pixel shaders operate one or more times (depending on passes) on every pixel. Moving an object farther from the camera doesn't speed up the vertex shader - it's still got the same number of vertices to process. The pixel shader, on the other hand, has fewer pixels to work with, since the object is now smaller on the screen, and so runs faster.

Prior to shader model 3 (I think), shaders were fixed in size - no loops, no branching, even the texkill instruction didn't stop the pixel shader from executing! If you want to scale shader performance, say based on distance from the viewer, you've got to do it in DB with multiple loaded shaders, not from within the shader itself.

The pixel shader operates on one pixel. Just one. It sounds obvious, but it's pretty hard to come to terms with (it was for me, anyway!). How does blurring work, then? What about edge detection? Both of these need sample data from neighbouring pixels, right? They do, and we provide them with extra data by giving them extra textures. To make a 4x sample blur, take one texture, apply it to four texture stages, and read them into the pixel shader so it can take an average.
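For instance, assuming the vertex shader has already written four slightly offset copies of the UVs into TEXCOORD0-3, the averaging end of it might look something like this (untested, and 'samp' is just a name I've made up):

```hlsl
sampler samp;   // the same texture bound to each of the four stages

float4 PS_Blur( float2 uv0 : TEXCOORD0,
                float2 uv1 : TEXCOORD1,
                float2 uv2 : TEXCOORD2,
                float2 uv3 : TEXCOORD3 ) : COLOR
{
    // Four samples of the same texture at slightly offset coordinates,
    // averaged to give a simple 4-tap blur.
    float4 col = tex2D( samp, uv0 );
    col += tex2D( samp, uv1 );
    col += tex2D( samp, uv2 );
    col += tex2D( samp, uv3 );
    return col * 0.25;
}
```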

Is that HLSL you spied above? Indeed it is, since I'm finally getting around to learning it! I've done a couple of basic things, and now I'm going to try my hand at something a bit fancier. Looks like there's demand for a rain shader, so I'll probably give that a go.

Thanks for reading!
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 15th Jan 2006 10:02 Edited at: 15th Jan 2006 10:17
Welcome back ninja! Some time ago you helped me out with a cool terrain splatting shader. I'm back, and haven't stopped working on my project. I'm getting a little better with HLSL and have implemented my own per-pixel splatting shader for my terrain (hope to implement normal maps soon too).

Anyway, I was asking all the vertex alpha questions in regards to my cloud shader. Before implementing the shader, my clouds were a simple dome, with invisible verts along the edge (so the clouds would fade out into the distance). It worked great. However, now that I've applied my shader, all vertex alpha has been lost.

If I understand what you wrote, you're saying that there is no such thing as vertex color in shaders? That they simply provide a 'skeleton' for the pixel shader? I hope that's not true... How can it be faster to calculate alpha on a per-pixel level? I would have to do an expensive distance check on each pixel, rather than each vert. How else could I have things fade out in a radius around the camera? Please explain!

*Edit - Follow up question - You mentioned you had trouble understanding the fact that a pixel shader is run for one pixel; well, I'M having trouble understanding how many times a vertex shader is being run. I mean, let's say I want to grab the color of a pixel based on some UV data. In my vertex shader I might be doing all kinds of crazy things to the UV data (scrolling it, etc...), yet the pixel shader seems to magically get the correct color based on ONE UV coord!? Here's an example...

My vertex output may include

float2 uv: TEXCOORD0;

FROM THAT (2 floats) how in the hell can the pixel shader know what color to return when I call this...

float4 color = tex2D( tex1samp, IN.uv );

*head explodes*

All you need is zeal
Ninja Matt
19
Years of Service
User Offline
Joined: 5th Jun 2004
Location: Lincolnshire, UK
Posted: 15th Jan 2006 23:53 Edited at: 15th Jan 2006 23:56
Ah, now realtime calculation is a different story. When you're using pre-computed alpha values, storing them in the vertex colour isn't really any faster than using a texture - the only difference is that you need extra memory for the texture.

When it comes to calculating the alpha in the shader itself, it's nearly always faster to do it in the vertex shader. You'll then pass the alpha on to the pixel shader, which should then be able to output the interpolated alpha value for each pixel. More often than not, moving the alpha value to the final output won't slow the shader down, since it'll be tied to one of the other inputs, such as lighting or one of the textures.

Basically, to see your alpha channel in effect, it has to be part of the final pixel shader output. Doesn't matter how it gets there or where it's calculated, just provided it's there when the shader finishes!

Usually, each vertex is put through the vertex shader once, unless you've got extra passes or something like that in effect. All vertices in the scene get this treatment, in order to build the screen-space skeleton you described (it probably has a proper name, but I've no idea what it is!).

The important thing to note is that this happens to every vertex long before the pixel shader comes into play. Once you've got the skeleton, the rasteriser begins to turn it from mathematically-defined geometry into the discrete pixels needed for display. By the time the pixel shader comes around, the interpolation whatnot has given the pixel all the input the shader needs.

So, in answer to your question: the pixel shader doesn't magically know what to do just by looking at the UV coords of one vertex - the UV coords of all three vertices have already been interpolated and the result buffered, ready for the pixel shader to use.
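Using your own example, something like this is what's really going on (I've added a made-up sampler declaration so it stands alone):

```hlsl
sampler tex1samp;   // your sampler from before

// The vertex shader writes uv once per vertex...
struct VSOutput
{
    float4 pos : POSITION;
    float2 uv  : TEXCOORD0;
};

// ...but by the time this runs, the rasteriser has already interpolated uv
// across the triangle from all three vertices' values, so every pixel gets
// its own unique coordinate to sample with.
float4 PS( float2 uv : TEXCOORD0 ) : COLOR
{
    return tex2D( tex1samp, uv );
}
```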

I know it's complicated, and it doesn't always make a lot of sense to me, either! Fortunately, you can probably get away without knowing how most of the stuff behind the scenes works, just that it does work!

Still, if you want to delve a bit further into it, you could try this link:

http://www.gamasutra.com/features/20030514/fosner_pfv.htm

It's got a pretty good diagram that shows how everything fits together, without getting stupidly complicated like most pipeline diagrams do.
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 16th Jan 2006 02:15 Edited at: 16th Jan 2006 02:54
Thanks ninja, that was one of the better explanations I have read. So let me see if I got this right...

---

1.) sync is called

2.) all objects (limbs aka meshes aka groups of verts) are culled, and objects still in view are sent to the vertex shader.

3.) the vertex shader is run for EVERY vertex, and a 'master' vertex output is created.

4.) this master output is input for the pixel shader, which then gets a single 'bit' of data for each pixel (a 2-float UV is actually an interpolated value based on the master vertex output).

5.) the pixel shader outputs a color to the screen.

---

close? When you look at an fx file through the eyes of a novice programmer it's hard to see this order and looping. Up till now I always read a pixel shader like any other program, line by line, top to bottom. Obviously I knew that was wrong, but there is ZERO documentation for how data flows to/from the shader.

On an unrelated note, I still have some questions about vertex alpha. According to that article you linked, the ONLY input to a pixel shader is really UV data. Can you even import vertex color to a pixel shader? If you could, then you could (seeing as how you'd be getting an interpolated value) use that color to scale your pixel alpha.

*edit hmm wait, it does say you can import UV AND color to a pixel shader... maybe this won't be as hard as I thought. I can just calculate distance in my vertex shader, output the appropriate color, then import an (interpolated) color to the pixel shader, and base the final PIXEL color on the input vertex color...

Dang, I'll bet this would work to get my terrain shader to fade out in a radius around the camera too...

All you need is zeal
Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 16th Jan 2006 07:38
Yup, I've solved my alpha problem. All I had to do was add COLOR to my vertex input, send the color on its way to the output (since the mesh already had the proper colors), then input the color into my pixel shader. At that point all I had to do was multiply every pixel color by the 'vertex' color (I imagine that is pretty slow ), and everything now works as it should. The clouds fade out into the horizon.

So, my next task (for my terrain shader) is to figure out how to calculate a 'distance from camera' value for every vert, then pass that info to the pixel shader. The distance will determine how advanced the output color will be (if normal maps are to be calculated or if it should use vertex lighting, etc...). What's the best way to do this? Is there no faster way to get vertex distance from the camera than the old sqrt() method?
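Something like this is what I'm picturing for the vertex end of it - totally untested, and all the constant names are made up:

```hlsl
float4x4 WorldViewProj;   // made-up constant names - these would be set from DB
float3   CameraPos;       // camera position in world space
float    FadeStart;       // distance at which fading begins
float    FadeEnd;         // distance at which the vertex is fully transparent

struct VSInput  { float4 Pos : POSITION; float4 Colour : COLOR0; };
struct VSOutput { float4 Pos : POSITION; float4 Colour : COLOR0; };

VSOutput VS( VSInput IN )
{
    VSOutput OUT;
    OUT.Pos = mul( IN.Pos, WorldViewProj );

    // distance() is just length(a - b), so there's still a sqrt in there -
    // but it only runs once per vertex, which should be cheap. This assumes
    // IN.Pos is already in world space; if not, transform it by the world
    // matrix first.
    float dist = distance( IN.Pos.xyz, CameraPos );
    OUT.Colour = IN.Colour;
    OUT.Colour.a *= 1.0 - saturate( (dist - FadeStart) / (FadeEnd - FadeStart) );
    return OUT;
}
```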

All you need is zeal
Catalyst
20
Years of Service
User Offline
Joined: 6th Sep 2003
Location:
Posted: 16th Jan 2006 22:41
Hey look, everybody's favourite Ninja is back!

So, after digging through the forums/internet for a nice way of doing a stroke, I decided that the only ways I was seeing to do it nicely would take too long to render, and ended up finding that PrestonC made a cel shader with a very quick way of making the stroke... a very slight modification gives me about the effect I'm looking for. Thank you PrestonC!

Now, on to the next shader I'd like to try using. RenderMonkey has an excellent sky effect called CloudsEffect_ASM. Here it is:



Now, there are probably multiple issues with getting this one working in DBPro easily, but the one thing that seemed to be an issue was the object itself. I tried converting the object into a .X file and telling it where to load it from; doing a checklist for effect values would usually give a 0, but after the changes it gave a 1. But it made the object completely disappear. Since this one seems to lock itself to the camera, I don't really know if I should see the object at all when the shader is applied. One way or another, does anyone know how to make this shader work in DBPro? I'm trying to learn about shaders, so if you can explain why it wasn't working as well, that would be great.
Vector ScOpE
18
Years of Service
User Offline
Joined: 4th Nov 2005
Location: middle world UK
Posted: 17th Jan 2006 15:34 Edited at: 17th Jan 2006 15:36
Calling the shader experts: I found this cool normal map shader on my travels. Any chance we could get this working in DBPro? I'm no expert on these.
The source to the shader is attached.
The link to the site I found it on is here:
http://www.monitorstudios.com/bcloward/shaders_NormalMapSpecular3lights.html

and some cool pics of what it can do





Nothing is foolproof to a sufficiently talented fool
Catalyst
20
Years of Service
User Offline
Joined: 6th Sep 2003
Location:
Posted: 18th Jan 2006 05:03
Just dig back towards the beginning, Ninja Matt already posted a working normal map shader capable of doing multiple lights with specularity.
re faze
19
Years of Service
User Offline
Joined: 24th Sep 2004
Location: The shores of hell.
Posted: 18th Jan 2006 06:31
@ninja matt
why don't you work for TGC? There was a freelance sticky around here somewhere.

Zealous
19
Years of Service
User Offline
Joined: 13th Sep 2004
Location: Colorado Springs
Posted: 18th Jan 2006 08:23
Looks like there's a built-in function called distance() (it's probably just using sqrt() but oh well). Is there any way to see how hard my GPU is working? Is it always going to be 100% regardless of how many shaders are running on it? I have an ATI Radeon 9800 Pro.

And for a new question, how would you LAYER two scrolling textures in a shader? Imagine two cloud layers (with patches of alpha) scrolling at different speeds. You'd think it would be easy, but I can't get it to look right. Lerping or adding the two colors doesn't work, am I missing something? Would it be easier to just use two separate objects, one on top of the other?
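For reference, here's roughly the kind of thing I've been trying (I've made the sampler names up for this post):

```hlsl
sampler CloudSamp0;   // lower cloud layer - names made up for this post
sampler CloudSamp1;   // upper cloud layer
float   Time;         // scroll timer, fed in from DB each frame

float4 PS_Clouds( float2 uv : TEXCOORD0 ) : COLOR
{
    // Scroll the two layers at different speeds by offsetting the UVs.
    float4 lower = tex2D( CloudSamp0, uv + float2( Time * 0.01, 0.0 ) );
    float4 upper = tex2D( CloudSamp1, uv + float2( Time * 0.03, 0.0 ) );

    // Neither of these looks right to me:
    // return lower + upper;            // adding just blows out to white
    return lerp( lower, upper, 0.5 );   // a 50/50 lerp washes both layers out
}
```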

All you need is zeal
