ShaderLab | a tool for cross-compiling Unity shaders
kandi X-RAY | ShaderLab Summary
A cross-platform shader compiler modeled on Unity's ShaderLab. It uses the DirectX Shader Compiler and the GLSLang compiler as front ends and SPIRV-Cross as the back end to compile HLSL and GLSL into DXIL, SPIR-V, HLSL, Metal, and GLSL.
Community Discussions
Trending Discussions on ShaderLab
QUESTION
In a Unity ShaderLab shader you can expose shader properties to the material inspector in the editor. This is done by placing the properties you want to expose in the Properties section.
ANSWER
Answered 2020-Mar-28 at 10:01
From a quick search I found that there is a MaterialPropertyDrawer that can be extended to add custom tags to shader inspectors (ref: https://docs.unity3d.com/ScriptReference/MaterialPropertyDrawer.html).
Thus, you could use a Vector property in the shader, create a custom attribute, say [ShowAsVector2], and write a MaterialPropertyDrawer for it which shows only two input fields and assigns their values to the vector's x and y components. This would result in the shader property being written as:
[ShowAsVector2] _Position2D("Position", Vector) = (0, 0, 0, 0)
QUESTION
I have been trying to achieve this effect from Tim-C (which now seems to be outdated, even with the fixes posted in the comments) with ShaderGraph (Unity 2019.3.0f3). As far as I know you can't do that within ShaderGraph itself, so after looking at this page of the ShaderLab documentation I came up with the following shaders, which use a shader graph I made.
Using this shader displays the shader graph completely fine:
...ANSWER
Answered 2019-Dec-19 at 16:50
It turns out that LWRP/URP using only the first pass is a “feature”: https://issuetracker.unity3d.com/issues/lwrp-only-first-pass-gets-rendered-when-using-multi-pass-shaders
I will probably get around this by using two rendered meshes layered over each other. One will do the ZWrite (first), and the other will just be the normal shader graph.
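For illustration, the depth-writing half of that workaround could be as small as the sketch below. This is a minimal sketch, not code from the question: the shader name and the Geometry-1 queue offset are assumptions, and the second mesh would keep its normal shader graph material.

Shader "Hidden/DepthOnlyPrepass"
{
    SubShader
    {
        // Drawn just before the default geometry queue so depth is in place first.
        Tags { "RenderType"="Opaque" "Queue"="Geometry-1" }
        Pass
        {
            ZWrite On     // write depth
            ColorMask 0   // but output no color
        }
    }
}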
QUESTION
I have two shaders, UnlitRGBA and TransparentNoOverlap. They are as follows:
UnlitRGBA.shaderlab
...ANSWER
Answered 2019-May-26 at 22:57
Note that the expected result and the setup are slightly ambiguous, so there may be several ways to go.
Simply put: the material with shader A (Unlit) is covered up by B (Transparent) because B is drawn after it. "Queue"="Transparent" is the tag responsible for that effect in B. Shader A doesn't have it, so it uses the default "Geometry" queue. (You can read about the index values used under the hood here: https://docs.unity3d.com/Manual/SL-SubShaderTags.html.) In this regard you could specify the queue indices explicitly and make sure they match your expected behavior.
This disregards another aspect: the ZBuffer, which deals with occlusion of one mesh by another and is designed so that occlusion does not depend on draw order. Pixels can be discarded based on a depth test, and depth can be written by an object if it is specified to do so. (https://docs.unity3d.com/Manual/SL-CullAndDepth.html)
Meshes display z-fighting if both of them write and test depth and are co-planar (in the same spot), or if the buffer has such low precision that the stored values make it seem like they are.
Provided that both your meshes need to occlude and be occluded by others and themselves, one way would be to set B (Transparent) to use ZTest Less, so that it only appears where A is not present at all, even if drawn after A, and even if co-planar with A.
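As a minimal sketch of those render states (the shader name and blend mode here are illustrative, not the poster's original code):

Shader "Custom/TransparentNoOverlapSketch"
{
    SubShader
    {
        // Explicit queue: drawn after the default "Geometry" queue used by A.
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Pass
        {
            Blend SrcAlpha OneMinusSrcAlpha
            ZWrite Off   // transparent surfaces usually don't write depth
            ZTest Less   // strictly less than the stored depth: co-planar
                         // pixels of A win, so B never draws on top of A
        }
    }
}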
QUESTION
Update
The main question is: how can I pass the world-space vertex positions of the triangle to the surface shader in a Unity shader? As mentioned in a comment, it might be possible to pass them from a geometry shader, but I read somewhere that implementing a custom geometry shader overrides Unity's logic for calculating shadows etc. I would add the triangle information in the Input structure, but before I change my mesh-generation logic for it I would like to know if this is feasible. For this solution the vertex positions of the triangle must be constant for every pixel in a triangle and not be interpolated.
This is the original question:
I am writing a surface shader for a triangle mesh. I assign a custom vertex attribute carrying a texture id to every vertex. Now I want the surface shader to apply the texture as seen in the following image. (Note that each color represents a texture.)
In the surface shader I need the 3 vertices that define the triangle and their texture ids. Furthermore I need the position of the pixel I am drawing.
- If all texture ids are the same, I pick this texture for all pixels.
- If one or two texture ids differ, I calculate the pixel's distance to the triangle vertices and pick the texture as seen in the next image:
The surface shader needs to be aware of the pixel's triangle. With this logic I should get the shading I am looking for. I am creating my mesh programmatically, so I can add the triangle vertices and their texture ids as vertex attributes and pass them to the surface shader.
But I am not sure if this is feasible with how surface/vertex shaders work. Is there a relationship between the vertex and the pixel that I can use to get my custom triangle information? Is there a better way of doing this?
I am using Unity's ShaderLab for my shaders.
...ANSWER
Answered 2018-Oct-05 at 13:32
No, you should not be using (nor do you have access to) vertex data in a fragment shader. In a fragment shader you only have access to data about that given pixel; you cannot go back and look at the mesh that formed it (this is the way the pipeline is constructed).
What you can do (and what is a common practice) is to bake the data into one of the available channels (i.e. the other UV mapping channels) of the vertices within the vertex shader. This way the fragment shader will have access to the value via the interpolators.
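A minimal sketch of that baking approach, assuming the mesh generator has already written the texture ids into the second UV channel (mesh.uv2, i.e. TEXCOORD1); the shader name and field names are illustrative:

Shader "Custom/TriangleIdSurfaceSketch"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert
        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;
            float3 triData; // custom data carried through the interpolators
        };

        void vert (inout appdata_full v, out Input o)
        {
            UNITY_INITIALIZE_OUTPUT(Input, o);
            // texcoord1 holds ids baked at mesh-build time; note the value
            // is interpolated, so it is only constant across a triangle if
            // all three vertices carry the same ids.
            o.triData = float3(v.texcoord1.xy, 0);
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            // IN.triData is available per pixel here; branch on it to pick
            // a texture. The plain sample below is just a placeholder.
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}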
QUESTION
I ported a plasma ball shader from Shadertoy to Unity as an image effect attached to the camera. It works fine in the Editor and in a Windows standalone build, but it does not work on Android devices: it flashes blue and black images on Android.
Here is what it looks like in Unity Editor and Windows Build:
Here is what it looks like on Android:
The ported Shader code:
...ANSWER
Answered 2017-Aug-01 at 20:34
You need to use the VPOS semantic for positions in the fragment shader on OpenGL ES 2.
From Unity docs:
A fragment shader can receive position of the pixel being rendered as a special VPOS semantic. This feature only exists starting with shader model 3.0, so the shader needs to have the #pragma target 3.0 compilation directive.
So to get screen space positions:
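A minimal sketch following the VPOS example in the Unity docs (this is not the poster's full plasma shader; the pattern is what matters):

CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma target 3.0          // VPOS requires shader model 3.0
#include "UnityCG.cginc"

// note: no SV_POSITION in this struct; it is passed as a separate output below
struct v2f { float2 uv : TEXCOORD0; };

v2f vert (float4 vertex : POSITION,
          float2 uv : TEXCOORD0,
          out float4 outpos : SV_POSITION) // clip-space position
{
    v2f o;
    o.uv = uv;
    outpos = UnityObjectToClipPos(vertex);
    return o;
}

fixed4 frag (v2f i, UNITY_VPOS_TYPE screenPos : VPOS) : SV_Target
{
    // screenPos.xy is the screen-space position of the pixel being rendered
    return fixed4(frac(screenPos.xy / 100.0), 0, 1);
}
ENDCG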
QUESTION
I'm rather new to ShaderLab with Unity. I am trying to distort the vertices so that they are pushed backwards and towards the camera, almost like looking at the scene from a 45 degree angle. I am replicating an effect from a game for fun. This is the code used for the effect.
I've tried implementing the code in a shader script like so:
...ANSWER
Answered 2017-Jul-06 at 07:32
The fourth component is 'w', also called the inverse stretching factor. To convert from a vec4 to a vec3 it's best to do position.xyz / position.w, and to put it back into a vec4 you can write return fixed4(position, 1).
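In CG inside a ShaderLab pass, that divide might look like the following sketch (the names position and vertexPos are illustrative):

float4 position = UnityObjectToClipPos(vertexPos); // a vec4 with a w component
float3 divided  = position.xyz / position.w;       // vec4 -> vec3: divide by w
fixed4 restored = fixed4(divided, 1);              // back to a vec4 with w = 1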
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported