renderdoc | RenderDoc is a stand-alone graphics debugging tool | Graphics library
kandi X-RAY | renderdoc Summary
RenderDoc is a frame-capture based graphics debugger, currently available for Vulkan, D3D11, D3D12, OpenGL, and OpenGL ES development on Windows 7-10, Linux, Android, Stadia, and Nintendo Switch. It is completely open-source under the MIT license. If you have any questions, suggestions, or problems, you can create an issue here on GitHub, email me directly, or come into IRC or Discord to discuss it. To install on Windows, run the appropriate installer for your OS (64-bit | 32-bit) or download the portable zip from the builds page. The 64-bit Windows build fully supports capturing from 32-bit programs. On Linux only 64-bit x86 is supported; there is a precompiled binary tarball available, or your distribution may package it. If not, you can build from source.
Community Discussions
Trending Discussions on renderdoc
QUESTION
I am trying to render using the dynamic rendering extension. To this end, I am trying to render just a triangle with these two shaders:
...ANSWER
Answered 2022-Feb-21 at 00:37
In case someone runs into this problem in the future: I was trying to render just a single frame (rather than in a loop), so I was not synchronizing objects because I thought it would not be necessary.
It turns out it very much is necessary, so if you are rendering to the swapchain images, even if just once, things won't work unless you use the appropriate fences and semaphores.
QUESTION
I have a strange issue: a Vulkan application I am writing seemingly runs fine when run from the terminal, but if run from RenderDoc an assertion inside the official .hpp header triggers.
Since this only happens if the program is launched with RenderDoc, I am having a hard time trying to debug it.
Is there a way to get the exact environment configuration RenderDoc uses to run the program, so that I can replicate the bug?
It is quite bizarre that it only happens when the new dynamic rendering extension is active, too; if it is not requested, RenderDoc doesn't seem to trigger the assertion. And I am on the latest version (1.18).
...ANSWER
Answered 2022-Feb-20 at 08:45
If anyone runs into something like this in the future: the problem was that an old instance of RenderDoc was installed on my system, which in turn created conflicts when loading the program into RenderDoc, as Vulkan wasn't properly configured.
Uninstalling the old version fixed it.
QUESTION
I'm trying to apply a custom effect using the DirectXTK. The effect is supposed to be an "unlit" shader with just one texture, but for some reason the texture is stretched across the model. I looked in RenderDoc and the texture coordinates appear to be loaded correctly, so I'm not sure what's going on.
UnlitEffect.h ...ANSWER
Answered 2022-Feb-14 at 01:59
Chuck Walbourn was correct. My issue was that I was normalizing my texture coordinates in the pixel shader.
The correct code is:
return BaseColor.Sample(SampleType, vout.TexCoord);
QUESTION
Environment: Unity3D 2021.2.7f1, Direct3D11.
I am trying to make Laplacian smoothing work on the GPU. For this I set up a compute shader with, among other things, a VertexBuffer (input GraphicsBuffer) and an outVertexBuffer (output GraphicsBuffer). Unfortunately, I have a weird problem with storing data (Vector3 vertices) into the GraphicsBuffer which I use as the "output" of the compute shader.
Assigning of ComputeShader component:
...ANSWER
Answered 2022-Feb-10 at 19:13
Try this:
outVertexBuffer = new GraphicsBuffer(GraphicsBuffer.Target.Structured, 5000000, 12);
Notice the changed Target: a GraphicsBuffer needs at least one of the compute target flags to be bindable to a compute kernel, and Target.Vertex is not such a flag. You could still add it, but in your case it's not needed.
In fact, outVertexBuffer could just be a ComputeBuffer. There is no need to use a GraphicsBuffer, since you just copy it to the mesh on the CPU side.
EDIT:
Another thing: SetVertexBufferData() requires you to configure all vertex attributes first, and the data you pass should basically be raw vertex data: positions, normals, UVs (if any), and so on.
If you only want to set vertices, perhaps it would be easier to just use Mesh.SetVertices().
QUESTION
In 7.5, the Vulkan spec says about vkCmdWaitEvents:
The first synchronization scope only includes event signal operations that operate on members of pEvents, and the operations that happened-before the event signal operations. Event signal operations performed by vkCmdSetEvent that occur earlier in submission order are included in the first synchronization scope, if the logically latest pipeline stage in their stageMask parameter is logically earlier than or equal to the logically latest pipeline stage in srcStageMask.
I'm confused by this phrasing. Does this mean the first synchronization scope is the signalling of the events passed in pEvents, plus any event signals that occur earlier and meet the stage mask and submission order requirement, or is it only event signals that are both passed in and meet the requirement?
In either case, since you can just pass in events with pEvents, what is srcStageMask useful for?
ANSWER
Answered 2022-Jan-25 at 01:10
The first synchronization scope only includes event signal operations that operate on members of pEvents, and the operations that happened-before the event signal operations.
The first scope of vkCmdWaitEvents is only the hypothetical signal on the pEvents (and all the stuff that happens-before it transitively, as would be defined by whatever signaled the event).
Event signal operations performed by vkCmdSetEvent that occur earlier in submission order are included in the first synchronization scope, [...]
vkCmdSetEvent cannot be reordered past vkCmdWaitEvents by the driver; it would basically be broken if it did.
QUESTION
I am trying to update the value of a uniform variable within my shader using an ImGui slider; however, the value I pass in is correct on the CPU side, but once it reaches the GPU it becomes a negative value. I give an example of how I am setting this all up, where someVal is the value being passed to the GPU, which represents the radius value in the image I show (naming was changed to help make things slightly clearer). This is what I am currently doing:
ANSWER
Answered 2022-Jan-15 at 19:52
I was passing in an array of samples whose size was set to 64, but in my shader I had declared it as 32. This caused my other variables to receive weird values.
QUESTION
I'm a new Android application engineer. I'm trying to make a Vulkan rendering backend demo which uses SurfaceView as a View in the Android Java code. I made a GLES rendering backend demo using GLSurfaceView as well. In the application code, I use the setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_LANDSCAPE) API to set the activity from the default portrait to landscape, but it doesn't work with SurfaceView while it works with GLSurfaceView.
I used RenderDoc to capture the rendering result; the image is in landscape layout (the same layout as the GLES backend). I suspect something is wrong with some settings on the activity or window, but I can't figure out the root cause. Could somebody help with what the problem might be?
Here is the Java source code.
...ANSWER
Answered 2021-Dec-27 at 13:30
In the <activity> tag of AndroidManifest.xml, add the line android:screenOrientation="portrait".
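As a sketch, the manifest entry would look like this (the activity name here is a placeholder, not the asker's actual class):

```xml
<activity
    android:name=".VulkanDemoActivity"
    android:screenOrientation="portrait">
</activity>
```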
QUESTION
I encountered an issue when transmitting uniform buffer data from the CPU to the GPU. It seems to be something to do with memory alignment between the CPU and GPU. I define a uniform buffer object as follows:
...ANSWER
Answered 2021-Dec-27 at 09:12
I got the answer from the link Vulkan Memory Alignment for Uniform Buffers. It says that each element on the CPU side should be given an alignment as follows:
- an int/float: an alignment of 4
- a vec2: an alignment of 8
- a vec3, vec4, or mat4: an alignment of 16
It works after I set it up as follows:
QUESTION
I'm trying to implement model loading, but I'm stuck on one problem. When I try to draw a mesh (a single textured quad written by hand for test purposes), for some reason duplicated data associated with the first vertex is passed to the vertex shader (RenderDoc screenshot).
Here is an example using a stripped-down version of my class that still exhibits this behaviour:
...ANSWER
Answered 2021-Nov-03 at 15:52
In your Mesh::Setup you have this line at the end:
QUESTION
Library: assimp; model: *.fbx.
The skeletal animation can be played normally, but the character's neck is stretched and does not move.
Using RenderDoc, I found that the vertices input to the vertex shader are fine, but the output vertices show that the neck is abnormal. I still can't tell where the problem is.
The character's head can't move and the neck is stretched. I want to know where exactly it goes wrong; what could make the head unable to move?
This is the result when the error occurred:
pass the input to the shader
...ANSWER
Answered 2021-Oct-11 at 16:06
I haven't looked through your code in detail, but from skimming over it, it appears that you aren't using aiNode::mTransformation, which you have to in order to get the correct transformation with respect to a bone's parent bone. ASSIMP's documentation describes this parameter as the transformation relative to the node's parent.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network