Usage questions/problems #108

Reio22 commented Oct 25, 2019

Hi,
just discovered this and it seems super useful, thank you for making it!
There are some things that I would love clarification on (sorry but I could not find much documentation):

  1. What exactly does audio input affect?
    only the effect strength of those nodes where I set the numeric drop-down to something other than 0?

  2. What does the dropdown menu (0, 1/8, 1/4 ...) on some nodes do?
    my impression is that the dropdown sets the beats and the slider sets the maximum effect strength. So strength is normally 0, but every n beats the actual effect strength is set to (current audio amplitude) * (slider value) - is that about right?

  3. How to get microphone input?
    if I play a movie/music it is used, but mic input would be more important (or would that be too slow in a live situation?)

  4. What does the bouncy ball at the top tell me?
    is it a default beat for cases when there is no audio input (and irrelevant if there is audio)?

  5. The logo node is not for loading images in general but just for the Radiance logo - right?

  6. What is the placeholder node for?

  7. Can I select a range of "useful" frequencies?

  8. How to get video input from other programs, or the output of Radiance into other programs?

Possible errors (latest 0.6.1 build from AUR):

  • wobsphere does not seem to react to its "speed" dropdown
  • if I select Custom - MPV and leave the Instantiate MPV line blank, then click OK, Radiance crashes.

ervanalb-vs commented Oct 27, 2019

  1. Audio input affects all nodes in that it provides a timebase. All effects will progress faster if the music tempo increases. Individual effects may also use aspects of the audio waveform, such as the hi/mid/low levels. vu and oscope are examples of patterns that do this.

  2. You are correct about the slider; we call it "intensity." The drop-down selects the frequency. Honestly, it is entirely up to each effect to decide what to do with these numbers, with the only guideline being that an effect at intensity 0 should pass the video through unchanged (the "identity" property). Most effects "pulse" when using non-zero frequencies (such as zoomin), but some simply move faster (like wwave).

  3. Radiance uses the default audio source provided by whatever is controlling sound on your computer. If you're using PulseAudio, you can install pavucontrol and use it to select an input for Radiance (see the pactl sketch after this list). Currently, you can't select a different source from within Radiance. It seems strange that your computer would default to an internal "monitor" source; it might already be using the microphone.

  4. The bouncy ball shows the current tempo, as determined by the beat detection algorithm. I think it just maintains the last tempo if there is no audio.

  5. You can load your own images into Radiance. If you put them in your library folder (~/.local/share/Radiance/Radiance/library/ on my system), they will show up in the left menu (see the copy example after this list).

  6. The placeholder node doesn't do anything. It's useful for scripting and using libradiance, which aren't really supported features at the moment. If you enable AutoDJ from the right menu, you'll notice that it adds two placeholder nodes that act as book-ends for the set of patterns that AutoDJ controls.

  7. I'm guessing you mean audio frequencies. Right now, patterns are given the "high", "mid", and "low" levels, which have hard-coded cutoffs (defined at the top of Audio.c). There are plans to expose these in the future, but right now they are fixed.

  8. The youtube and mpv nodes are the best way to get data into Radiance. You can pull in webcams, screen capture with X11grab, videos from sites like YouTube, and livestreams. I would suggest playing around with mpv on the command line to get a sense of its capabilities (see the mpv sketches after this list). Getting video out of Radiance can currently only be done to a monitor or to a light display. You can add a ScreenOutputNode, which will allow you to select an output (e.g. an external monitor). I suppose you could get the video output into another program by X11-grabbing a dummy X11 display, but that sounds terrible. Alternatively, if you have a light display to control, you should look at Radiance on PIP; this will work well if you have some LED strips to run.
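
For question 3, a minimal sketch of picking the microphone with PulseAudio's pactl (the device name below is only an example; substitute one from your own list, or use pavucontrol's Recording tab while Radiance is running):

```sh
# list the available capture sources (microphones, "monitor" loopbacks, ...)
pactl list short sources

# make the microphone the default source before starting Radiance
# (replace the name with one printed by the command above)
pactl set-default-source alsa_input.pci-0000_00_1f.3.analog-stereo
```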
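
For question 5, getting an image to show up is just a matter of copying it into the library folder mentioned above (adjust both paths for your system; the file name is only an example):

```sh
# copy an image into Radiance's library so it appears in the left menu
cp ~/Pictures/my-image.png ~/.local/share/Radiance/Radiance/library/
```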
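
For question 8, a few mpv invocations worth experimenting with on the command line before pointing an mpv node at the same sources (the device paths are examples, and the exact x11grab syntax can vary with your mpv/ffmpeg build):

```sh
# webcam via V4L2
mpv av://v4l2:/dev/video0

# capture the local X11 display (screen capture)
mpv av://x11grab::0.0

# play a YouTube video or livestream (requires youtube-dl/yt-dlp)
mpv 'https://www.youtube.com/watch?v=<video id>'
```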

Thanks for reporting those two errors. It looks like wobsphere will wobble faster / more aggressively when you set its frequency higher. You can't tell without any sound input, and even then it might be subtle. I can definitely reproduce the MPV crash, so I'll fix that soon.

Thanks for your interest in Radiance, let me know if you have more questions!

Reio22 commented Oct 29, 2019

Thank you for taking the time to write this detailed answer! I will continue testing soon.
