Record expression values as keyframes
  • "Initials" by "Florian Körner", licensed under "CC0 1.0". / Remix of the original. - Created with dicebear.comInitialsFlorian Körnerhttps://github.com/dicebear/dicebearTD
    TDSOJohn
    2mo ago

    Thanks for the detailed explanation; not the answer I hoped for, but definitely the answer I needed! I'll look into all 3 options, thanks again!

  • Record expression values as keyframes
  • "Initials" by "Florian Körner", licensed under "CC0 1.0". / Remix of the original. - Created with dicebear.comInitialsFlorian Körnerhttps://github.com/dicebear/dicebearTD
    TDSOJohn
    2mo ago

    First of all, thank you for your answer! What I meant by animation is the light color modulation itself, so I would basically need to listen to the incoming driver data and record it, so that I can play it back afterwards as an animation (yes, we would export to USDZ in order to import it into Reality Composer).
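
    For the export step, a minimal sketch assuming Blender's built-in USD exporter; the file path is a placeholder, and depending on the Blender version a separate .usdz conversion may still be needed before Reality Composer can read the file:

    ```python
    # Minimal sketch: export the scene, including baked keyframes, with
    # Blender's USD exporter. The path is a placeholder; a .usdz conversion
    # step for Reality Composer may still be required.
    import bpy

    bpy.ops.wm.usd_export(
        filepath="/tmp/laser_show.usdc",
        export_animation=True,  # include the recorded keyframes
    )
    ```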

  • Good morning. I hope this is not a stupid question, as I am very new to Blender. My setup is:

    - a 3D environment built from iPad photogrammetry
    - some lasers we insert into it (each a simple cylinder with an emission node)
    - the lasers are controlled via QLC+ --> Art-Net --> BlenderDMX, with a Python expression that modulates the emission color of every laser from a separate DMX channel

    We would now love to store the DMX animation directly in Blender as keyframes, so we can export the animation and put it back on the iPad for an AR simulation. Is there any way to record the driver data in real time?
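
    One possible approach, as a minimal sketch (not BlenderDMX API: "Laser" and "Emission" are placeholder names for the material and node, and the handler simply samples whatever value the drivers produced on each frame): record during one live playback, then bake the samples to keyframes.

    ```python
    # Minimal sketch: record driven values during playback, then bake them
    # to keyframes. "Laser" / "Emission" are placeholder names.
    import bpy

    samples = {}  # frame number -> sampled RGBA color

    def record_driven_color(scene, depsgraph=None):
        mat = bpy.data.materials.get("Laser")
        if mat is None:
            return
        color_input = mat.node_tree.nodes["Emission"].inputs["Color"]
        samples[scene.frame_current] = tuple(color_input.default_value)

    # 1) Register the handler, then play the timeline once while DMX is live.
    #    Remove it afterwards with frame_change_post.remove(record_driven_color).
    bpy.app.handlers.frame_change_post.append(record_driven_color)

    def bake_samples():
        # 2) Drop the drivers so keyframes take effect, then write the samples.
        color_input = bpy.data.materials["Laser"].node_tree.nodes["Emission"].inputs["Color"]
        color_input.driver_remove("default_value")
        for frame, color in samples.items():
            color_input.default_value = color
            color_input.keyframe_insert("default_value", frame=frame)
    ```

    Since a keyframe on a driven property has no effect while the driver is still attached, the bake step removes the drivers before writing the recorded values.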

  • Question about Blender properties
  • "Initials" by "Florian Körner", licensed under "CC0 1.0". / Remix of the original. - Created with dicebear.comInitialsFlorian Körnerhttps://github.com/dicebear/dicebearTD
    TDSOJohn
    7mo ago

    Thanks, I already solved it! I didn't know that drivers existed. The solution was to create a driver for each of the RGB components of the emission color, connect it to a DMX channel, and divide the value by 255.

    I then connected to QLC+ using a virtual DMX channel and could control everything 😌
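
    For reference, the same driver setup can also be scripted; a minimal sketch, where "Laser" and "Emission" are placeholder names and dmx_value() is a hypothetical stand-in for however the DMX channel value is exposed to the driver namespace:

    ```python
    # Minimal sketch: one scripted driver per RGB component of the Emission
    # color, scaling the 0-255 DMX value to Blender's 0-1 range.
    # "Laser", "Emission" and dmx_value() are placeholders.
    import bpy

    color_input = bpy.data.materials["Laser"].node_tree.nodes["Emission"].inputs["Color"]

    for index in range(3):  # R, G, B
        fcurve = color_input.driver_add("default_value", index)
        fcurve.driver.type = 'SCRIPTED'
        fcurve.driver.expression = f"dmx_value({index + 1}) / 255"
    ```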

  • Hello everyone, I have a question about Blender properties. I'm reading the BlenderDMX documentation, and on the [BlenderDMX docs](https://blenderdmx.eu/docs/dmx/#blenderdmx-dmx-driver-for-blender) page, in the paragraph "BlenderDMX DMX driver for Blender", it says that I can use DMX signals to control any Blender property. I have also modelled some lasers and would love to control some of the nodes' properties (mostly the Emission Color) with DMX for a simulation. Does anyone know if that's possible and how to do it? A search for "blender properties" and "blender property keywords" gave me nothing.
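
    A short sketch illustrating why this is possible in principle: shader node inputs are ordinary Blender properties with an RNA data path, so they can receive drivers like anything else ("Laser" and "Emission" are placeholder names):

    ```python
    # Minimal sketch: confirm that a shader node input is an addressable,
    # drivable property. "Laser" and "Emission" are placeholder names.
    import bpy

    color_input = bpy.data.materials["Laser"].node_tree.nodes["Emission"].inputs["Color"]

    # The RNA data path that drivers (and keyframes) use to address it:
    print(color_input.path_from_id("default_value"))

    # Add a driver to the red component; a DMX add-on can then feed it
    # like any other driven property.
    red_fcurve = color_input.driver_add("default_value", 0)
    ```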


    Hi everyone, first of all I'm so happy to have found a Processing instance here on Lemmy! I'm doing a Processing project where I have ~100 instances of people, all coming from the same 3D model. I'm currently storing all the PShapes inside a 1D array and doing all the drawing inside the `draw()` function. Now, I would love to put everything that concerns the person inside a class. If I create something like:

    ```
    class Person {
        PShape person_obj = loadShape("path/to/shape.obj");
        Person() {}
    }
    ```

    does Processing automatically load only a single model, or do I have 100 models in RAM? If the answer is the latter: I tried changing `PShape` to a static variable, but `loadShape()` is not static and everything results in an error. The Processing documentation about `explicit` simply says "yeah, it's a Java language feature, study it if you need it", which makes sense. So I started looking into static usage in Java and stumbled upon [this StackOverflow post](https://stackoverflow.com/questions/4969171/cannot-make-a-static-reference-to-the-non-static-method), which basically says to use a context class holding all the 3D objects I statically need and pass it to everything else that needs the 3D models. Does anyone know if that's the correct approach for Processing?
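
    A note on the snippet above: with that field initializer, loadShape() runs once per new Person(), so ~100 separate PShape objects end up in RAM. A minimal sketch of the usual Processing pattern instead, loading the model once in setup() and handing the same PShape reference to every instance (the path and field names stand in for the question's own code):

    ```java
    // Minimal sketch: load the .obj once, share the reference across all
    // Person instances, so only one copy of the model sits in RAM.
    PShape personModel;
    Person[] people = new Person[100];

    void setup() {
      size(800, 600, P3D);
      personModel = loadShape("path/to/shape.obj");  // loaded exactly once
      for (int i = 0; i < people.length; i++) {
        people[i] = new Person(personModel, random(width), random(height));
      }
    }

    void draw() {
      background(0);
      for (Person p : people) {
        p.display();
      }
    }

    class Person {
      PShape model;  // shared reference, not a copy
      float x, y;

      Person(PShape model, float x, float y) {
        this.model = model;
        this.x = x;
        this.y = y;
      }

      void display() {
        pushMatrix();
        translate(x, y);
        shape(model);
        popMatrix();
      }
    }
    ```

    Passing the shared reference through the constructor is essentially a lighter-weight version of the "context object" approach from the linked StackOverflow answer.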
