pix-plot | A WebGL viewer for UMAP or TSNE-clustered images | Data Visualization library
kandi X-RAY | pix-plot Summary
A WebGL viewer for UMAP or TSNE-clustered images
Community Discussions
Trending Discussions on pix-plot
QUESTION
Introduction:
I render an isometric map with Three.js (v95, WebGL Renderer). The map includes many different graphic tilesets. I get the specific tile via a TextureAtlasLoader and its position from a JSON file. It looks like this:
The problem is that it performs really slowly the more tiles I render (I need to render about 120,000 tiles on one map). I can barely move the camera then. I know there are several better approaches than adding every single tile as a sprite to the scene, but I'm stuck somehow.
Current extract from the code to create the tiles (it’s in a loop):
...ANSWER
Answered 2018-Sep-07 at 15:46
I think Sergiu Paraschiv is on the right track. Try to split your rendering into chunks. This strategy and others are outlined here: Tilemap Performance. Depending on how dynamic your terrain is, these chunks could be bigger or smaller. This way you only have to re-render chunks that have changed. Assuming your terrain doesn't change, you can render the whole terrain to a texture and then you only have to render a single texture per frame, rather than a huge array of them. Take a look at this tutorial on rendering to a texture, it should give you an idea on where to start with rendering your chunks.
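To make the render-to-texture idea concrete, here is a minimal sketch (not from the original answer) of baking one static chunk of tiles into a WebGLRenderTarget once and then drawing that chunk as a single textured plane. The names tileSprites, CHUNK_SIZE, and the target resolution are assumptions, and the three-argument render() call is the r95-era API.

```javascript
// Rough sketch of baking one static chunk of tiles into a texture (all names and
// sizes here are assumptions, not from the question).
const CHUNK_SIZE = 64 * 32; // e.g. 64 tiles of 32 world units per side
const renderTarget = new THREE.WebGLRenderTarget(2048, 2048);

// An orthographic camera framing exactly one chunk.
const chunkCamera = new THREE.OrthographicCamera(0, CHUNK_SIZE, CHUNK_SIZE, 0, -100, 100);

// A throwaway scene holding only this chunk's tile sprites.
const chunkScene = new THREE.Scene();
tileSprites.forEach(function (sprite) { chunkScene.add(sprite); });

// Bake once, not every frame (r95-style call; newer three.js uses renderer.setRenderTarget()).
renderer.render(chunkScene, chunkCamera, renderTarget);

// Display the baked chunk as a single quad in the main scene.
const chunkMesh = new THREE.Mesh(
  new THREE.PlaneBufferGeometry(CHUNK_SIZE, CHUNK_SIZE),
  new THREE.MeshBasicMaterial({ map: renderTarget.texture })
);
scene.add(chunkMesh);
```

Because the chunk is only re-baked when its tiles change, the per-frame cost drops from thousands of sprites to a handful of textured quads.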
QUESTION
I have a Three.js scene with points and am trying to figure out the relationship between my points' positions and screen coordinates. I thought I could use the function @WestLangley provided in answer to a previous question, but implementing this function has raised some confusion.
In the scene below, I'm storing the x coordinates of the left-most and right-most points in world.bb.x, and am logging the world coordinates of the cursor each time the mouse moves. However, when I mouse over the left-most and right-most points, the world coordinates do not match the min or max x-coordinate values in world.bb.x, which is what I expected them to do.
Do others know what I can do to figure out the world coordinates of my cursor at any given time? Any help others can offer is greatly appreciated!
...ANSWER
Answered 2018-Sep-02 at 21:40
Aha, instead of dividing the event x and y coordinates by the window's width and height (which only works for canvases that extend through the full window height and width), I need to divide the event x and y coordinates by the canvas's width and height!
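A minimal sketch of that correction, assuming the renderer's canvas does not fill the window; the helper name getWorldCoords is illustrative.

```javascript
// Sketch: convert a mouse event to normalized device coordinates using the
// canvas's own bounding rect (not the window size), then unproject into world space.
function getWorldCoords(event, camera, renderer) {
  const rect = renderer.domElement.getBoundingClientRect();
  const x = ((event.clientX - rect.left) / rect.width) * 2 - 1;  // -1..1 across the canvas
  const y = -((event.clientY - rect.top) / rect.height) * 2 + 1; // -1..1, flipped for WebGL

  const vector = new THREE.Vector3(x, y, 0.5); // a point between the near and far planes
  vector.unproject(camera);                    // NDC -> world space
  return vector;
}
```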
QUESTION
I'm working on a scene in which I'm using Points, InstancedBufferGeometry, and a RawShaderMaterial. I'd like to add raycasting to the scene, such that when a point is clicked I can figure out which point was clicked.
In previous scenes [example], I've been able to determine which point was clicked by accessing the .index attribute of the match returned by the raycaster.intersectObject() call. With the geometry and material below, though, the index is always 0.
Does anyone know how I can determine which of the points was clicked in the scene below? Any help others can offer on this question would be very appreciated.
...ANSWER
Answered 2018-Aug-09 at 16:06
One solution is to use the technique sometimes referred to as GPU Picking.
First study https://threejs.org/examples/webgl_interactive_cubes_gpu.html.
Once you understand the concept, study https://threejs.org/examples/webgl_interactive_instances_gpu.html.
Another solution is to replicate on the CPU the instancing logic implemented on the GPU. You would do so in your raycast() method. Whether it is worth it depends on the complexity of your use case.
three.js r.95
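For readers who want a starting point before studying those examples, here is a rough sketch of the GPU-picking idea written against the r.95-era API. All names (pickingScene, pickingGeometry, mouseX, mouseY) are illustrative, and the picking scene is assumed to use a shader that outputs the pickingColor attribute unmodified.

```javascript
// Idea: give every point a color that encodes its index, render those colors to an
// off-screen target with a flat (unlit) shader, then read back the pixel under the
// cursor and decode the index.

// 1. Encode index + 1 into an RGB attribute (0 is reserved for "nothing hit").
const pickingColors = new Float32Array(numPoints * 3);
for (let i = 0; i < numPoints; i++) {
  const id = i + 1;
  pickingColors[i * 3 + 0] = ((id >> 16) & 255) / 255;
  pickingColors[i * 3 + 1] = ((id >> 8) & 255) / 255;
  pickingColors[i * 3 + 2] = (id & 255) / 255;
}
pickingGeometry.addAttribute('pickingColor', new THREE.BufferAttribute(pickingColors, 3));

// 2. On click, render the picking scene off-screen and read the pixel under the cursor.
const width = renderer.domElement.width;
const height = renderer.domElement.height;
const pickingTarget = new THREE.WebGLRenderTarget(width, height);

renderer.render(pickingScene, camera, pickingTarget); // r95-style; newer: setRenderTarget()

const pixel = new Uint8Array(4);
renderer.readRenderTargetPixels(pickingTarget, mouseX, height - mouseY, 1, 1, pixel);

// 3. Decode the index.
const id = (pixel[0] << 16) | (pixel[1] << 8) | pixel[2];
if (id > 0) console.log('clicked point index:', id - 1);
```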
QUESTION
I'm working on a Three.js scene in which I'd like to update some textures after some time. I'm finding that updating the textures is very slow, however, and drags the frame rate down to 1-2 FPS for several seconds (when updating just a single texture).
Is there anything one can do to expedite texture updates? Any insights others can offer on this question would be very appreciated.
To see this behavior, click the window of the example below. This will load the first texture update (another click will trigger the second texture update). If you try to zoom after one of these clicks, you'll find the screen freezes and the FPS will drop terribly. Does anyone know how to fix this problem?
...ANSWER
Answered 2018-Jul-26 at 15:26
Your canvases are 16384 by 16384. That's basically insanely large.
For RGBA format, that is 1,073,741,824 bytes: a gigabyte of texture data that gets sent from the CPU to your GPU when you set texture.needsUpdate = true.
You will definitely notice this getting uploaded to the card.
If your use case absolutely requires textures that large, then you may need to consider doing incremental updates via gl.texSubImage2D, using a bunch of smaller textures and only updating one of them per frame, or only updating those textures at the start of your app and not thereafter.
For reference, there are very few cases I've seen where textures larger than 4k per side are needed, and that is about 1/16th the size of your textures.
This has nothing to do with three.js btw. It's a fundamental characteristic of GPU/CPU interaction. Uploads and state changes are slow and have to be carefully orchestrated and monitored.
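A sketch of the "many smaller textures, one upload per frame" option; the tile size, grid size, and helper names are assumptions, not part of the original answer.

```javascript
// Sketch: split the huge canvas into a grid of smaller canvas-backed textures and
// upload at most one of them per frame.
const TILE = 4096, GRID = 4; // 4 x 4 grid of 4096px tiles covers 16384px
const tileTextures = [];

for (let i = 0; i < GRID * GRID; i++) {
  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = TILE;
  tileTextures.push(new THREE.CanvasTexture(canvas));
}

// When a tile's canvas is redrawn, queue it instead of flagging it immediately.
const dirtyTiles = [];
function markTileDirty(i) { if (dirtyTiles.indexOf(i) === -1) dirtyTiles.push(i); }

function animate() {
  requestAnimationFrame(animate);
  const next = dirtyTiles.shift();
  if (next !== undefined) tileTextures[next].needsUpdate = true; // one upload per frame
  renderer.render(scene, camera);
}
```

Spreading the uploads out this way keeps any single frame from stalling on a gigabyte-sized transfer.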
QUESTION
I have a Three.js scene that passes a canvas as a uniform to a RawShaderMaterial. After the initial scene renders, I alter the canvas (in the case below, I just paint the canvas red). I then indicate that shaderMaterial.needsUpdate = true; but no color appears on the points.
Color does appear on the points if I move the ctx.fill() loop above the var material = declaration.
Does anyone know what one must do to update a canvas after the initial render when using the RawShaderMaterial? Any help others can offer would be hugely helpful!
...ANSWER
Answered 2018-Jul-26 at 14:03
Ah, one must reassign the value of the uniform (stored in the material) and then mark the material as needing an update:
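Something along these lines, where the uniform name map and the canvas/ctx variables are assumptions rather than the asker's actual code.

```javascript
ctx.fillStyle = 'red';
ctx.fillRect(0, 0, canvas.width, canvas.height); // repaint the canvas

const texture = new THREE.Texture(canvas);
texture.needsUpdate = true;            // re-upload the canvas contents to the GPU

material.uniforms.map.value = texture; // reassign the uniform's value...
material.needsUpdate = true;           // ...and flag the material
```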
QUESTION
I have a Three.js scene that passes a few attributes to a RawShaderMaterial. After the initial render, I'd like to update some attributes, but haven't been able to figure out how to do so.
Here's a sample scene (fiddle):
...ANSWER
Answered 2018-Jul-25 at 22:56
It seems that to update a buffer attribute (or indexed buffer attribute), one can set the .dynamic property of that buffer to true, manually mutate the buffer, then set the .needsUpdate attribute of the buffer to true.
In this updated demo, after clicking the scene, the buffer attribute texIdx is updated such that all points have texIdx == 0.
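A sketch of that sequence, assuming the attribute lives on the points' (Instanced)BufferGeometry; note that more recent three.js releases replace .dynamic with setUsage(THREE.DynamicDrawUsage).

```javascript
const attr = geometry.attributes.texIdx;
attr.dynamic = true;            // hint that this buffer will be mutated

for (let i = 0; i < attr.array.length; i++) {
  attr.array[i] = 0;            // give every point texture index 0
}

attr.needsUpdate = true;        // re-upload the buffer on the next render
```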
QUESTION
I am working on a Three.js scene in which I'd like to use many points with different textures. However, so far I haven't been able to change textures in my fragment shader. Despite the fact that the points in the following scene should alternate between texture 0 (a cat) and 1 (a dog), all points show the same texture:
...ANSWER
Answered 2018-Jul-10 at 20:18
In case others end up here facing the same trouble I had, I wanted to say that the above example was a simplified version of a larger scene. In that other scene, I was passing in an attribute named texture, but I was declaring the attribute as textureIndex inside the vertex shader. I then passed textureIndex to the fragment shader as a varying, where it always equalled 0. So the moral of the story is: if you try to read from an attribute that isn't passed to the vertex shader, that value evidently equates to zero. It would be great if this threw an error instead.
In the silly example above I had commented out the varying declaration in the fragment shader. This is fixed:
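Since the original fiddle isn't reproduced here, the following is a generic sketch of the fix, written in RawShaderMaterial style (everything declared explicitly); the uniform names cat and dog are illustrative. The key points are that the attribute name in the vertex shader must match the name used on the geometry, and that the varying must be declared in both shaders.

```javascript
const vertexShader = `
  precision highp float;
  uniform mat4 projectionMatrix;
  uniform mat4 modelViewMatrix;
  attribute vec3 position;
  attribute float textureIndex;   // must match geometry.addAttribute('textureIndex', ...)
  varying float vTextureIndex;    // declared here...

  void main() {
    vTextureIndex = textureIndex;
    gl_PointSize = 10.0;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

const fragmentShader = `
  precision highp float;
  varying float vTextureIndex;    // ...and declared here as well
  uniform sampler2D cat;          // illustrative uniform names
  uniform sampler2D dog;

  void main() {
    gl_FragColor = vTextureIndex < 0.5
      ? texture2D(cat, gl_PointCoord)
      : texture2D(dog, gl_PointCoord);
  }
`;
```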
QUESTION
I am working on a scene (easier to see on bl.ocks.org than below) in Three.js that would benefit greatly from using points as the rendering primitive, as there are ~200,000 quads to represent, and representing those quads with points would take 4 times fewer vertices, which means fewer draw calls and higher FPS.
Now I'm trying to make the points larger as the camera gets closer to a given point. If you zoom straight in within the scene below, you should see this works fine. But if you drag the camera to the left or right, the points gradually grow smaller, despite the fact that the camera's proximity to the points seems constant. My intuition says the problem must be in the vertex shader, which sets gl_PointSize:
...ANSWER
Answered 2018-Jul-06 at 16:43
This is due to the distortion caused by rectilinear projection, which takes the depth value of each point (the "proximity distance" you were referring to) to be its Z (or Y) coordinate in camera space; you are instead taking this to be the Euclidean distance from the camera's position.
The diagram illustrates that, as you move the camera's axis away from the point, the distance delta increases (the point shrinks) while the depth value (the "apparent proximity") stays constant. To fix this, just replace delta with the depth value project[2] (or [1]; I forget which one OpenGL uses).
Note that if your modelViewMatrix is set up correctly according to convention, you shouldn't subtract cameraPosition.
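A sketch of what that change might look like in the vertex shader, assuming a ShaderMaterial (which injects position, modelViewMatrix, and projectionMatrix automatically); the pointScale uniform is illustrative.

```javascript
const vertexShader = `
  uniform float pointScale;
  void main() {
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);

    // -mvPosition.z is the camera-space depth of the vertex. Using it instead of
    // the Euclidean distance to the camera keeps points at equal depth the same
    // size when the camera pans sideways.
    gl_PointSize = pointScale / -mvPosition.z;

    gl_Position = projectionMatrix * mvPosition;
  }
`;
```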
QUESTION
Short question: How can I pass a list of textures to shaders and access the nth texture within a fragment shader (where n is a value passed as a varying from the vertex shader)?
Longer question: I'm working on a Three.js scene that represents multiple images. Each image uses one of multiple textures, and each texture is an atlas containing several thumbnails. I'm implementing a custom ShaderMaterial to optimize performance, but am confused about how to use multiple textures in the shaders.
My goal is to pass a list of textures and a number that represents the number of vertices per texture so that I can identify the texture that should be used for each image's vertices/pixels. I thought I could accomplish this by passing the following data:
...ANSWER
Answered 2018-Apr-25 at 00:12
You've written the vertex shader as if main is a for loop that will iterate through all the vertices and update vertexIdx and textureIdx as it goes along, but that's not how shaders work. Shaders run in parallel, processing every vertex at the same time. So you can't share what the shader computes about one vertex with another vertex.
Use an attribute on the geometry instead:
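A rough sketch of that approach on the JavaScript side; numVertices and verticesPerTexture stand in for values the asker already has, and addAttribute is the r.95-era name (newer releases use setAttribute).

```javascript
const textureIdx = new Float32Array(numVertices);
for (let i = 0; i < numVertices; i++) {
  textureIdx[i] = Math.floor(i / verticesPerTexture); // which atlas this vertex samples
}
geometry.addAttribute('textureIdx', new THREE.BufferAttribute(textureIdx, 1));

// Vertex shader: read the attribute and forward it as a varying.
//   attribute float textureIdx;
//   varying float vTextureIdx;
//   void main() { vTextureIdx = textureIdx; ... }
```

Note that GLSL ES 1.0 (WebGL 1) does not allow indexing an array of samplers with a varying, so the fragment shader typically selects among the textures with an if/else chain keyed on the varying's value.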
Community Discussions and Code Snippets contain sources from the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported