blender | A modular orchestration engine
kandi X-RAY | blender Summary
Blender is a modular remote command execution framework. Blender provides a few basic primitives for automating workflows across servers. Workflows are expressed in a plain Ruby DSL and executed using the CLI; a simple blender script might, for example, update the package index on three Ubuntu servers.
Top functions reviewed by kandi - BETA
- Run the job
- Queries the schedule
- Run a job
- Loads a configuration file from the given arguments
- Runs the given block
- Create a new lock
- Returns a new instance
- Append a task
- Set the schedule
- Define an event handler
blender Key Features
blender Examples and Code Snippets
Community Discussions
Trending Discussions on blender
QUESTION
I am trying to capture STDOUT to a file with Capture::Tiny.
To give you the bigger picture, my end-goal is to:
- run a Blender process on the command line
- capture the log to a filehandle
- watch the filehandle "grow"
- pipe the content of the filehandle to a websocket for the user to watch progress via a web page
My plan is to use Capture::Tiny, and the example provided in this thread.
However, I am stuck at step 1: I can capture STDOUT and STDERR as expected like this:
...ANSWER
Answered 2022-Mar-23 at 14:33
system (and qx) run the command to completion and then return. You need to run the command asynchronously. IPC::Run is my preferred solution for this type of interaction.
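The answer above is Perl-specific. As a rough Python sketch of the same asynchronous idea (the Blender arguments and file names below are placeholders, not taken from the question), the standard subprocess module can stream the log as it is produced:

# Start Blender in the background and write its output to a log file as it
# arrives, so a watcher (e.g. a websocket handler) can follow the file's growth.
import subprocess

proc = subprocess.Popen(
    ["blender", "--background", "scene.blend", "--python", "render.py"],  # placeholder scene/script names
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,  # merge STDERR into the same stream
    text=True,
)

with open("blender.log", "w") as log:
    for line in proc.stdout:   # yields lines as Blender prints them
        log.write(line)
        log.flush()            # keep the file "growing" for the watcher

proc.wait()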
QUESTION
My goal is to make a ROBLOX-like customization system for my game, where a user can choose between hundreds of pieces of clothing and accessories for their avatar. Now, I would like to use Mixamo to animate my character, but Mixamo needs a fully boned rig to download the animations. That's a problem: I need Blender to bone the rig, but each rig will be different since the user is creating their own character, and if I had all the rigs pre-made, there would be literally millions of combinations of characters!
The alternative I'm considering is Tween.js, but that's a really bad option since it would take a ton of work and time just to get a single animation, and it still wouldn't be as good as Mixamo.
At this point, I'm sure you have no idea what I'm talking about, so here's an image to describe my issue:
I hope the diagram made my problem more clear. Below I've listed some of the possibilities that could potentially solve this problem, but then again, not sure.
- Maybe you can download an animation from Mixamo without a specified rig to it, so you can apply it to any character?
- Maybe I just animate the base rig and then apply the textures for all the body parts (so instead of downloading all the rig possibilities, I just download all of the clothing textures)?
Does anyone know how to do this?
...ANSWER
Answered 2022-Feb-22 at 19:45
For anyone else who was wondering the same thing: you should first download the desired animation from Mixamo on the base model (without the customizations). Once you load the animation into three.js, you should be able to see a group, which includes the components from which the model was made. You can then manually apply as many textures as you want to the desired parts of the rig. So your model will move normally (like the animation) while your character looks exactly how you or the user wants it. That should do it!
QUESTION
I tried to export a GLTF model from Blender to Three.js. It works, but I get some artifacts in Three.js with the lighting and shading: there are lines and squares, and I don't know why.
I used only the Principled BSDF node in Blender to shade my model. If I set the material in Three.js (MeshPhongMaterial) it works fine, but not with the Principled BSDF node from Blender. Any ideas?
I'm trying to make the object cast a shadow and react to the lighting. This works well both with MeshPhongMaterial and with the Principled BSDF, but in the second case I don't like these black stripes.
...ANSWER
Answered 2022-Mar-14 at 12:12
The usual approach to mitigate self-shadowing artifacts is to modulate the bias property of your shadow-casting light. Try it with:
QUESTION
I'm scripting using Python in Blender, a 3D graphics tool. I initially intended to post this topic on the blender stack exchange, but I've come to the conclusion that it's closer to the basics of Python, which led me to write the question here. Please let me know if I'm on the wrong forum!
I made the mistake of using a built-in function name as a variable name in Python some time ago, like list = ['a', 'b', 'c', 'd']. To take a more extreme case, none of the print calls in my code would have worked if I had declared a variable like print = 'a'.
So I tried to write code that prints out all the names that are already reserved. Even if I wanted to use list or print as variable names, I would pick a different name, because I saw True returned and knew these names were already in __builtins__. But after testing this in the Python console, I went to the text editor, ran the same code, and this time got False instead of True. What happened?
I used the dir() function to check the values actually inside __builtins__. The output from the Python console contained print and list in __builtins__ as expected, but the output from the text editor had different values. Upon closer inspection, I noticed values such as keys, items, and values, which are methods available on a dictionary!
This time, I used the type() function to print the type of __builtins__. The Python console printed <class 'module'> and the text editor printed <class 'dict'>.
I know that Blender's Python console and text editor work in separate scopes, but I don't know what's going on specifically. Why is the output different for __builtins__?
ANSWER
Answered 2022-Mar-12 at 17:29
You're using different implementations of Python, and the value of __builtins__ is deliberately left underspecified in the docs:
As an implementation detail, most modules have the name __builtins__ made available as part of their globals. The value of __builtins__ is normally either this module or the value of this module's __dict__ attribute. Since this is an implementation detail, it may not be used by alternate implementations of Python.
So __builtins__ is simply a convenience mechanism for particular implementations and is generally set to either builtins or builtins.__dict__. It looks like your Python console does the former and Blender's text editor does the latter. To do it in a uniform way across implementations, you can do:
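One uniform approach (a sketch of the idea; not necessarily the exact code from the answer) is to import the builtins module explicitly and check names against it, instead of relying on the implementation-defined __builtins__:

# Check for clashes against the builtins module itself, which is the same
# regardless of whether __builtins__ is bound to the module or to its __dict__.
import builtins

for name in ("list", "print", "my_variable"):  # "my_variable" is a made-up example name
    print(name, "is a builtin:", hasattr(builtins, name))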
QUESTION
ANSWER
Answered 2022-Feb-03 at 22:05
Load the texture, apply the texture to the object you want, then add the model to the scene:
QUESTION
I would like to be able to robustly stop a video when it arrives at certain specified frames, so I can give oral presentations based on videos made with Blender, Manim...
I'm aware of this question, but the problem is that the video does not stop exactly at the right frame. Sometimes it continues forward for one frame, and when I force it to come back to the initial frame you see the video going backward, which looks weird. Even worse, if the next frame is completely different (different background...), this is very visible.
To illustrate my issues, I created a demo project here (just click "next" and see that when the video stops, sometimes it goes backward). The full code is here.
The important part of the code I'm using is:
...ANSWER
Answered 2022-Jan-21 at 19:18
The video has a frame rate of 25 fps, not 24 fps:
After putting in the correct value, it works fine: demo
The VideoFrame API relies heavily on the FPS you provide. You can determine the FPS of your videos offline and send it as metadata along with the stop frames from the server.
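For example, determining the FPS offline could be as simple as this Python sketch (assuming OpenCV is installed; "video.mp4" is a placeholder path):

# Read the frame rate reported by the container so it can be sent to the
# player as metadata along with the stop frames.
import cv2

cap = cv2.VideoCapture("video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)  # e.g. 25.0 rather than an assumed 24
cap.release()
print("fps:", fps)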
The site videoplayer.handmadeproductions.de uses window.requestAnimationFrame() to get the callback.
There is a newer, better alternative to requestAnimationFrame: requestVideoFrameCallback() lets us perform per-video-frame operations on the video.
The same functionality you demoed in the OP can be achieved like this:
QUESTION
I'm working with a mesh of a cave, and have manually set all the face normals to be 'correct' (all faces facing outside) using Blender (Edit Mode -> select faces -> Flip Normals). I also visualised the vertex normals in Blender, and they all point outwards across the surface:
The mesh is then exported as an STL file.
Now, however, when I visualise the same thing in Pyvista with the following code:
...ANSWER
Answered 2022-Jan-27 at 14:38
The convenience functions for your case seem a bit too convenient.
What plot_normals() does under the hood is access cave.point_normals, which in turn calls cave.compute_normals(). The default arguments to compute_normals() include consistent_normals=True, which according to the docs does
Enforcement of consistent polygon ordering.
There are some other parameters which hint at potential black magic going on when running this filter (e.g. auto_orient_normals and non_manifold_ordering, even though the defaults seem safe).
So what seems to happen is that your mesh (which is non-manifold, i.e. it has open edges) breaks the magic that compute_normals tries to do with the default "enforcement of polygon ordering". Since you already enforced the correct order in Blender, you can tell pyvista (well, VTK) to leave your polygons alone and just compute the normals as they are. This is not possible through plot_normals(), so you need a bit more work:
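A rough sketch of that extra work (my reading of the suggestion; the file name, array name, and arrow size are assumptions, not code from the answer):

# Compute normals without letting VTK reorder or re-orient the polygons, then
# draw them as arrows. Assumes the computed array is stored under "Normals".
import pyvista as pv

cave = pv.read("cave.stl")  # placeholder path
mesh = cave.compute_normals(
    consistent_normals=False,   # keep the polygon ordering fixed in Blender
    auto_orient_normals=False,  # do not flip anything automatically
)

plotter = pv.Plotter()
plotter.add_mesh(mesh, color="white")
plotter.add_arrows(mesh.points, mesh["Normals"], mag=0.5, color="red")
plotter.show()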
QUESTION
I exported a default cube from Blender 3.0 to glTF + bin. I am trying to draw it in pure WebGL.
It is just a very simple example. You will see magic numbers in it like:
...ANSWER
Answered 2021-Dec-14 at 09:38
The indices appear to be 16-bit integers instead of 8-bit integers:
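One way to double-check which index type the exporter actually wrote (a Python sketch; "cube.gltf" is a placeholder name) is to read the accessor's componentType from the .gltf JSON. A value of 5123 means UNSIGNED_SHORT, which pairs with gl.UNSIGNED_SHORT in the WebGL drawElements call:

# Inspect the index accessors of an exported glTF to see their integer width.
import json

COMPONENT_TYPES = {
    5121: "UNSIGNED_BYTE (8-bit)",
    5123: "UNSIGNED_SHORT (16-bit)",
    5125: "UNSIGNED_INT (32-bit)",
}

with open("cube.gltf") as f:  # placeholder file name
    gltf = json.load(f)

for mesh in gltf["meshes"]:
    for prim in mesh["primitives"]:
        accessor = gltf["accessors"][prim["indices"]]
        print(mesh.get("name", "<unnamed>"), "index componentType:",
              COMPONENT_TYPES.get(accessor["componentType"], accessor["componentType"]))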
QUESTION
Here is the face in fbx format that mediapipe uses for their face mesh model. It has 468 vertices. Here is the visualisation of the indices.
Here is the description of mediapipe's face mesh model. It outputs landmark positions.
How do I know which landmark belongs to which vertex? For example, in Blender: when I import the fbx face, how can I get the same indices as the landmarks of the mediapipe face mesh model?
...ANSWER
Answered 2021-Nov-16 at 05:28
It seems the indices in the Blender fbx model are the same as those provided by the mediapipe face mesh solution. These indices are the same as those in the mediapipe canonical face model UV visualization. This answer provides an example of getting a landmark by its index.
You need to have Developer Extras enabled. To see indices in Blender, the option in Edit Mode is under Viewport Overlays > Developer > Indices, as shown below. An alternate way to get indices can be found here.
I have shown an example below with left-eye landmark indices as they appear in the canonical face mesh UV visualization.
Indices Visualization Code
Code based on https://google.github.io/mediapipe/solutions/face_mesh.html.
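As a sketch of such a visualization (the image path and the highlighted indices below are placeholders, not the exact ones used in the answer):

# Run the mediapipe face mesh on a still image and label a few landmark
# indices so they can be compared with the vertex indices shown in Blender.
import cv2
import mediapipe as mp

EXAMPLE_INDICES = [33, 133, 263, 362]  # example landmark indices to highlight

image = cv2.imread("face.jpg")  # placeholder image path
with mp.solutions.face_mesh.FaceMesh(static_image_mode=True, max_num_faces=1) as face_mesh:
    results = face_mesh.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_face_landmarks:
    h, w = image.shape[:2]
    landmarks = results.multi_face_landmarks[0].landmark
    for idx in EXAMPLE_INDICES:
        lm = landmarks[idx]                   # coordinates are normalized to [0, 1]
        x, y = int(lm.x * w), int(lm.y * h)
        cv2.circle(image, (x, y), 2, (0, 255, 0), -1)
        cv2.putText(image, str(idx), (x + 3, y), cv2.FONT_HERSHEY_SIMPLEX, 0.3, (0, 255, 0), 1)
    cv2.imwrite("face_indices.jpg", image)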
QUESTION
I am developing in Python/Blender, and have two needs here:
- Import all the individual classes from my module (because they must each be registered with blender)
- Reload the module itself each time the script is executed (to prevent caching while I'm developing the plugin and press "reload scripts")
Currently I am doing this (in __init__.py):
ANSWER
Answered 2021-Sep-29 at 23:13
Some colleagues helped me with an answer; what ended up working was this in __init__.py:
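The exact snippet from the thread is not reproduced above, but a commonly used pattern for this in Blender add-ons (a sketch with hypothetical submodule names "operators" and "panels") looks like this in __init__.py:

# __init__.py -- re-import submodules when Blender runs "Reload Scripts",
# so edits take effect without restarting Blender.
# "operators" and "panels" are hypothetical submodule names.
if "bpy" in locals():
    # The add-on was already loaded once, so this is a reload.
    import importlib
    importlib.reload(operators)
    importlib.reload(panels)
else:
    from . import operators, panels

import bpy


def register():
    operators.register()
    panels.register()


def unregister():
    panels.unregister()
    operators.unregister()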
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.