Hello there, I've got a vertex shader which generates a height map, but I need to calculate the correct normals and don't have a clue where to begin. Is anyone able to walk me through it?
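For reference, a common starting point is to rebuild the normal from neighbouring height samples with central differences. A minimal GLSL sketch; the names heightMap, texelSize and heightScale are assumptions, not anything from the post:
uniform sampler2D heightMap;
uniform vec2 texelSize;     // 1.0 / height map resolution
uniform float heightScale;  // same scale used for the vertex displacement

vec3 heightMapNormal(vec2 uv)
{
    // sample the four direct neighbours (textureLod, since this runs in a vertex shader)
    float hL = textureLod(heightMap, uv - vec2(texelSize.x, 0.0), 0.0).r;
    float hR = textureLod(heightMap, uv + vec2(texelSize.x, 0.0), 0.0).r;
    float hD = textureLod(heightMap, uv - vec2(0.0, texelSize.y), 0.0).r;
    float hU = textureLod(heightMap, uv + vec2(0.0, texelSize.y), 0.0).r;

    // tangent vectors along U and V, with the height difference as the third axis
    vec3 du = vec3(2.0 * texelSize.x, 0.0, (hR - hL) * heightScale);
    vec3 dv = vec3(0.0, 2.0 * texelSize.y, (hU - hD) * heightScale);
    return normalize(cross(du, dv));   // tangent-space normal; rotate into world space as needed
}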
Hello, I was trying to follow a tutorial for making a foliage shader. I was recreating the shader in Shader Graph instead of Amplify, and right now I'm stuck because I don't know how to make a vertex offset. I'll post a picture of mine and of what I was following; can anyone tell me what I am missing? Edit: one kind man helped me with the offset, but there is probably still a problem with how I built the graph.
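For reference, in plain GLSL terms a vertex offset is nothing more than adding a vector to the position before the usual transforms (in Shader Graph the equivalent result feeds the vertex Position input of the master node/stack). A minimal sketch; windOffset is just an assumed example input:
#version 330 core
layout (location = 0) in vec3 aPos;
uniform mat4 model, view, projection;
uniform vec3 windOffset;   // assumed input driving the offset

void main()
{
    vec3 displaced = aPos + windOffset;   // the "vertex offset"
    gl_Position = projection * view * model * vec4(displaced, 1.0);
}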
I'm learning from https://learnopengl.com/Getting-started/Textures and am currently on the texture chapter. I can't seem to understand why we're passing texture coordinates to the vertex shader and then from there to the fragment shader.
Can't the fragment shader directly read my texture coordinates like the following:
layout (location = 2) in vec2 textcoord;
Or can I use uniforms to bypass this shit?
EDIT: Here's the code for both of them:
#version 330 core
layout (location = 0) in vec3 aPos;
layout (location = 2) in vec2 atextcoord;
out vec2 textcoord;
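// per-vertex output: the rasterizer interpolates it across each triangle before the fragment shader runs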
void main()
{
gl_Position = vec4(aPos, 1.0);
textcoord = atextcoord;
}
#version 330 core
in vec2 textcoord;
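// per-fragment input: the interpolated value, not one of the raw vertex attributes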
out vec4 FragColor;
uniform sampler2D ourTexture;
void main()
{
FragColor = texture(ourTexture, textcoord);
}
EDIT: Thank you everyone for your responses. Now I finally understand this pipeline shit. The texture coordinates received by the fragment shader are per pixel, and there are way more of them than the few I've passed to the program, hence it makes no sense to read them directly.
I have another question then: is it possible to print the values received by the shaders? Or at least pass them back to my program so that I can error-check them?
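For reference, shaders have no printf, but a common trick is to write the value you want to inspect into the output colour and look at (or glReadPixels) the result; transform feedback can capture vertex-shader outputs on the CPU side. A minimal sketch of the colour trick, reusing the fragment shader above:
#version 330 core
in vec2 textcoord;
out vec4 FragColor;

void main()
{
    // red = U, green = V; anything outside 0..1 will show up clamped
    FragColor = vec4(textcoord, 0.0, 1.0);
}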
Hi,
I'm not sure if I am missing a fundamental concept, but it honestly never made sense to me to treat the vertex shader as a "material", let alone combine it with the fragment shader in the same program.
Because at the end of the day one modifies pixels and the other modifies vertices, so why are they merged together?
Imagine a scenario where you need a vertex shader to achieve a snow trail, for example. Wouldn't it be better if you could just write a shader that modifies a vertex position based on whatever rule you set (a texture, or a world-space vector), and that's it: now you have the base "leave a mark" effect.
And then you could create separate fragment shaders: one for snow, one for sand, one for goo, one for water, etc., and all that shader would care about is basically the final color (light, shadow, smoothness, opacity, etc.; at the end of the day they are all colors), without it being bloated with settings from the vertex shader or settings to "consider" alternative scenarios (one shader for snow and sand and mud).
And for a more practical example, which I just encountered: I wanted to use the "Boing Kit", parts of which are CPU based (just attach a component) while other features are vertex-shader based. Well, if your game uses a custom shader to look original, you now have to find a way to include the Boing Kit vertex shader in your already-made custom shader. Boing Kit makes that easy to do, but it seems a bit weird to me, to be honest, and a waste of time on something that should be handled in a better way.
So can a shader expert clarify why this is the case? After all, in every shader language and editor there is a clear distinction between the fragment program and the vertex program, so why don't game engines, and even 3D software like Blender, create a higher-level separation so that you can create a material that modifies color and light response (fragment) and a material that modifies the shape (vertex)? And if the GPU must run them both as a single program to reduce draw calls per object or something, then just merge them before compiling. Is there a particular technical reason why this isn't a thing (yet)?
Thank you
My issue is simple:
void vertexDataFunc( inout appdata_full v, out Input o )
{
_Property = 1;    // works
_Property += 0.1; // doesn't
}
The second example either immediately maxes out the value (no, changing it to 0.00000001 won't help) or only applies a single step, as if the value wasn't being saved.
Why is that?
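For reference, material properties are uniforms, so a value written inside the shader is discarded when the draw finishes; a growing value has to come from outside, either accumulated on the CPU and pushed in each frame or derived from time. A minimal GLSL-style sketch of the time-based version, with u_time as an assumed input:
uniform float u_time;   // seconds since start, supplied by the application each frame

float animatedProperty()
{
    return 0.1 * u_time;   // grows by 0.1 per second without any shader-side stored state
}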
I am trying to write a shader where I want to access other vertices, but I could not figure out how to do it and could not find an answer after googling.
Something like:
void vertex()
{
vec3 otherVertex = VERTEX.getAdjacent(1, 1);
}
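For reference, a vertex shader only ever sees the vertex it is currently processing, so neighbour data has to be supplied explicitly, for example baked into an extra vertex attribute or fetched from a texture/buffer. A minimal GLSL-style sketch of the extra-attribute route; aNeighbourPos and mvp are assumed names:
#version 330 core
layout (location = 0) in vec3 aPos;           // this vertex
layout (location = 1) in vec3 aNeighbourPos;  // a neighbour's position, baked into the mesh
uniform mat4 mvp;

void main()
{
    // example use: lean this vertex slightly towards its baked neighbour
    vec3 adjusted = mix(aPos, aNeighbourPos, 0.1);
    gl_Position = mvp * vec4(adjusted, 1.0);
}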
I'm trying to make my own 3D engine with love2d. Everything has been going really well until I wanted to implement shadows in my engine. I have no idea where to even begin. Can someone please help me with this?
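For reference, the usual starting point is shadow mapping: render the scene's depth from the light's point of view into a texture, then compare each fragment's light-space depth against it. A heavily simplified sketch of the comparison step in generic GLSL (love2d's shader entry points differ slightly); shadowMap and lightSpacePos are assumed names:
#version 330 core
in vec4 lightSpacePos;        // fragment position transformed by the light's view-projection
uniform sampler2D shadowMap;  // depth rendered from the light in a first pass
out vec4 FragColor;

void main()
{
    // perspective divide, then remap from clip space [-1, 1] to texture space [0, 1]
    vec3 proj = lightSpacePos.xyz / lightSpacePos.w;
    proj = proj * 0.5 + 0.5;

    float closest = texture(shadowMap, proj.xy).r;   // nearest depth the light sees
    float current = proj.z;                          // this fragment's depth from the light
    float bias    = 0.005;                           // small offset against shadow acne
    float lit     = current - bias > closest ? 0.3 : 1.0;

    FragColor = vec4(vec3(lit), 1.0);                // darkened where occluded
}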
My project is going dandy, but I'm now trying to optimize it so that I can get good FPS. I'm using compute shaders for marching cubes (alongside the new mesh API to pass the triangles indirectly) together with an octree implementation, but I'm noticing that my program seems to be GPU bound: when generating worlds of about 8000x8000x8000 at a somewhat low level of detail, I get more than 4 million vertices.
My implementation of marching cubes does not have vertex sharing, so I'm wondering if that's possible with compute shaders. Also, if anyone has more tips, that would be greatly appreciated! I can give more info too.
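For reference, vertex sharing is possible with compute shaders. One common scheme is to give every grid edge exactly one vertex slot (each corner "owns" its +X, +Y and +Z edges), write the interpolated surface point into that slot in one dispatch, and in a second dispatch emit triangle indices that point at those slots, so triangles from neighbouring cells reuse the same vertex. A GLSL-style sketch of just the indexing, with GRID_SIZE as an assumed chunk resolution:
const uint GRID_SIZE = 64u;   // assumed chunk resolution

uint edgeVertexIndex(uvec3 corner, uint axis)   // axis: 0 = +X, 1 = +Y, 2 = +Z
{
    // every corner owns three edges, so each grid edge maps to exactly one slot
    uint linear = (corner.z * GRID_SIZE + corner.y) * GRID_SIZE + corner.x;
    return linear * 3u + axis;
}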
Basically the title. Is there a way to remove the position stream or do something about it?
This is a very silly question, but I can't figure it out.
Why doesn't this work?
shader_type canvas_item;
uniform float force = 10.0;
void vertex(){
VERTEX.x += sin(TIME) * force * VERTEX.y;
}
As far as I "know", if VERTEX has those positions (which they seem to have when I ""debug"" them with COLOR):
https://preview.redd.it/80nc6it34eb71.png?width=288&format=png&auto=webp&s=ca439551eaaa4138e8d7248cf89d571815835fff
then VERTEX should only be displaced on the bottom vertices (since VERTEX.y is (?) 0 on the others, which makes their displacement 0).
But the result is that all the vertices oscillate, like this:
https://preview.redd.it/k0o1h3s36eb71.png?width=458&format=png&auto=webp&s=b92248ccb7aa1162e9cc2dd4deae1654083b59bb
Please, can someone explain why this happens and why it doesn't work the way I described?
(BTW, I know I could solve the problem with UV, I'm just trying to fully understand (I don't care about the effect))
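For reference only, since the post already mentions the UV route: in a canvas_item shader VERTEX is in the node's local pixel coordinates, while UV is the value that runs 0..1 across the quad, so pinning one edge is usually written like this:
shader_type canvas_item;

uniform float force = 10.0;

void vertex() {
    // UV.y is 0.0 on the top edge and 1.0 on the bottom edge,
    // so the top stays pinned and the bottom swings
    VERTEX.x += sin(TIME) * force * UV.y;
}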
Hey there. I am having issues rendering my vertex shader. I want it to have the same behaviour as if the sprite was drawn with no effect, and then add my own logic on top of that. I tried to use the default .fx file that is provided with the content pipeline, but it just shows nothing, even after setting the mvp field. I think it would help if I knew how to get the position passed to spriteBatch.Draw() inside my shader, but I'm probably just missing something. Can you help despite my newbie question? Thanks!
I am looking at sample code from a respected source, and the mipmap level-of-detail parameter (a float) is passed to the vertex shader as a uniform, which then passes it on to the fragment shader without using the flat keyword, so it is subject to interpolation (but all the vertex shaders provide the same value, so what would be the point of interpolating it?). What could I be missing here?
Is it not better to pass such a variable as a uniform directly to the fragment shader?
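For comparison, a minimal GLSL sketch of the uniform-in-the-fragment-shader version (colorMap, texCoord and lodLevel are assumed names, not the sample's):
#version 330 core
in vec2 texCoord;
uniform sampler2D colorMap;
uniform float lodLevel;       // mip level, set once per draw
out vec4 FragColor;

void main()
{
    // textureLod samples the requested mip level explicitly,
    // so no interpolated per-vertex copy of the value is needed
    FragColor = textureLod(colorMap, texCoord, lodLevel);
}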
Greetings
How do I convert a vertex WorldPosition to a UVPosition inside a shader?
v2f vert (appdata v)
{
v2f o;
o.worldPos = mul (unity_ObjectToWorld, v.vertex);
o.vertex = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
return o;
}
fixed4 frag (v2f i) : SV_Target
{
// sample the texture
fixed4 col = tex2D(_MainTex, i.uv);
float2 uvPos = ???????????????
return col;
}
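As an aside, there is no general inverse from a world position back to the mesh's own UVs inside the fragment shader; what is usually meant is a planar (world-projected) mapping, which just picks two world axes and scales them. A GLSL-style sketch, with tiling as an assumed parameter:
uniform float tiling;   // assumed: texture repeats per world unit

vec2 worldToPlanarUV(vec3 worldPos)
{
    return worldPos.xz * tiling;   // project onto the world XZ plane; suits ground planes
}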
Thank you.