Channel: Tutorials - VDMX - MAC VJ SOFTWARE

Visualizing Audio Analysis FFT and Waveforms


Along with publishing control values that can be used to directly automate interface items like sliders and buttons, the Audio Analysis plugin in VDMX provides its input waveform and FFT data encoded as grayscale video streams that can be used to create real-time music visualizers and advanced sound-reactive effects.

For this video tutorial we'll be showing off how to use these video feeds with two included sample ISF generators.

Example FFT Color Lines VJ visualizer

In the first example source, FFT Color Lines, there are inputs for showing both the current FFT and Waveform feeds, making a VJ-style audio oscilloscope visualizer. Just select the audio analysis options from their respective drop-down menus in the layer source controls to connect the data to the composition.

The second example, FFT Spectrogram, makes use of ISF's ability to buffer video feeds to create a rolling spectrogram readout of frequency data over time.
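While the bundled FFT Spectrogram file has its own implementation, the buffering idea can be sketched in ISF roughly like this (the buffer name here is hypothetical): a persistent render pass keeps last frame's image, the newest FFT row is drawn along the bottom edge, and every other row copies from the row below it, scrolling the frequency history upward over time.

```glsl
/*{
  "DESCRIPTION": "Sketch of a rolling spectrogram using a persistent buffer",
  "CATEGORIES": [ "Generator" ],
  "INPUTS": [ { "NAME": "fftImage", "TYPE": "audioFFT" } ],
  "PASSES": [ { "TARGET": "scrollBuffer", "PERSISTENT": true } ]
}*/
void main() {
  vec2 loc = isf_FragNormCoord;
  float rowHeight = 1.0 / RENDERSIZE.y;
  if (loc.y < rowHeight) {
    // bottom row: write the newest FFT magnitudes (left = low frequencies)
    gl_FragColor = IMG_NORM_PIXEL(fftImage, vec2(loc.x, 0.5));
  }
  else {
    // all other rows: copy from the row below in last frame's buffer
    gl_FragColor = IMG_NORM_PIXEL(scrollBuffer, vec2(loc.x, loc.y - rowHeight));
  }
}
```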


Tips and Notes:

Audio Analysis FFT and Waveform values are available as video inputs to ISF sources and FX.

Multiple audio analysis plugins can be used to visualize different devices and channels.

Create your own ISF based visualizer styles using a text editor.

  • The Waveform image is a 1-pixel-tall image that contains the most recent raw incoming audio samples, where the brightness of each pixel is the amplitude at a particular point in time, from left to right. The width will vary depending on the audio device used but will typically be 512 pixels.
  • The FFT image is also 1 pixel tall but half the width of the waveform (typically 256 pixels). The brightness of each pixel is the amount of energy detected in that frequency band, from left to right.
  • These video feeds also work anywhere a video tap receiver is found. Try using them with the Layer Mask FX or directly as a layer source for interesting results.
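As a minimal sketch of reading these feeds from an ISF generator (illustrative only, not the source of the bundled examples), the `audioFFT` input type asks the host for the FFT image, and each pixel's red channel holds the magnitude for one frequency bin:

```glsl
/*{
  "DESCRIPTION": "Sketch: draw the FFT image as a bar graph",
  "CATEGORIES": [ "Generator" ],
  "INPUTS": [ { "NAME": "fftImage", "TYPE": "audioFFT" } ]
}*/
void main() {
  vec2 loc = isf_FragNormCoord;
  // sample the 1-pixel-tall FFT image at this horizontal position
  float magnitude = IMG_NORM_PIXEL(fftImage, vec2(loc.x, 0.5)).r;
  // light up every pixel at or below the magnitude to form a bar
  float bar = step(loc.y, magnitude);
  gl_FragColor = vec4(vec3(bar), 1.0);
}
```

Swapping the `audioFFT` input for a plain `audio` input would draw the raw waveform the same way.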

More Fun Audio Analysis Techniques


Download the completed project file for this tutorial.

Also in this series, Three Different Ways to use Audio Analysis (or other data-source) to Trigger a Movie.

Audio analysis used to drive LFO and movie rate sliders to create complex FX behaviors.

Using individual plugins to directly automate the control of different aspects of a live VJ performance or interactive installation can create some pretty awesome results, but one of the most powerful capabilities of VDMX is the ability to connect these plugins to each other to create more complex behaviors, interactions and visual effects.

For this set of tutorials we'll look at a few ways that the Audio Analysis plugin can be used alongside the LFO plugin and other standard interface controls as a demonstration of this technique.

Tip: Keep in mind that this same idea can be applied to other types of control plugins such as step sequencers, or with external controllers over MIDI, OSC, and DMX.


To begin we'll add an audio analysis plugin and load in some media files to work with. The first trick will be to use the provided data-sources to drive the rate slider in the movie controls so that the movie only plays forward when there is an audio input. Alternatively, the time slider can be directly driven by the values to scrub through the movie.

Next the same technique will be applied to an LFO plugin such that it only progresses forward or scrubs through the timeline of a waveform. By building a more complex piecewise waveform out of sine waves of different frequencies and amplitudes, we can make visual FX that become more intense as the audio level increases. This is particularly useful for FX like bump distortion that wobble around a center point.

Lastly we'll use slider presets to set up keyboard shortcuts for toggling each of these automations on and off during a performance.


Tips and Notes:

Create piecewise waveforms for the LFO in the UI Inspector window.

Choose the “Peak Frequency Magnitude” or a specific filter range.

Slider Presets can be used to toggle automations on and off.

Video Fundamentals – Part 3 – Visual FX


Video Fundamentals Table of Contents:

  1. General Overview
  2. Video Sources
  3. Video FX

3.0 – Introduction to Visual FX

Chroma Desaturation Mask FX removes all color from an image except the one specified.

Once a video source is playing on a layer, the next step in the process of visual performance is to apply real-time FX to each frame to change the way it looks before it is shown to the audience.

In some cases the FX being used serve a utility function; for example, a Color Correction filter makes it possible to adjust the contrast, saturation, and brightness levels of an image for calibration. Others are designed to stylize the image to match a particular aesthetic, such as glitch or film.

This set of video fundamentals tutorials first reviews the basics of FX in VJ software and then takes a look at each of these two main use cases. In the final section we'll open up some of the included Quartz Composer and ISF based FX to create customized versions with additional input parameters.


3.1 – How to Use Video FX

While the exact interface will vary, most VJ software has the ability to apply real-time image filters to video. Regardless of the plugin type, when loaded onto a layer each FX has a similar set of interface controls.

Since the levels of each parameter for image filters can be changed completely on the fly, any FX can be set to react to input from Audio Analysis to be sound reactive when working with musicians, automated with LFOs and Sequencers, or otherwise driven by external MIDI / OSC / DMX hardware.

Tips and Notes:

  • There are a handful of different formats that are commonly supported such as CoreImage, Quartz Composer, FreeFrameGL and ISF.
  • In VDMX every FX has its own wet/dry control and on/off button along with standard UI items for parameters specific to the filter.

3.2 – Utility FX

Certain FX are designed for general purposes such as masking, color correction and resizing an image. They can also be used for fixing problems with images like de-interlacing old DV footage or sharpening a slightly out of focus camera feed.

While they are typically intended for simple adjustment, many utility FX can also be applied stylistically by changing their parameters in a rhythmic manner using step sequencers, audio reactivity or other control plugins.

Case Study: Using a mask to apply an FX to only part of a layer.

Tips and Notes:

  • In VDMX the Color Adjustment, Geometry Adjustment and Masking categories contain many common utility FX.

Assignment:

  1. Practice removing a solid color background with a chromakey effect.

Challenge:

  1. Download and use the custom Face Tracking FX and data-sources for VDMX.

3.3 – Stylize FX

Many of the visual FX found in VJ software are used to change the look of a video stream in a more artistic fashion to achieve a specific aesthetic.

For example, the Bad TV glitch FX can be used to make high-def digital clips appear to have an older analog feel and the Thermal Camera FX can simulate FLIR style heat vision footage.

Case Study: DJ Mixer EQ Style Masking FX

Tips and Notes:

  • Apply multiple FX to a layer to create new looks and styles. Favorite FX-chains can be stored and loaded either locally on a layer or globally as saved assets.
  • The standard wet/dry slider and blend mode settings can be used to adjust the look of a stylize FX even if it has no other parameters.

Assignment:

  1. Download and use the Grid Pro template.

Challenge:

  1. Assign a custom FX chain to a media bin clip.

3.4 – Modifying, Creating and Installing Custom FX

Part of the process of applying image filters is to find your own visual style to set you apart from other VJs. While there are thousands of possible ways to combine the various pre-made FX that are available, sometimes the look you want to achieve might require installing a 3rd party add-on or creating your own.

Depending on the type of plugin, for example with Quartz Composer or ISF based FX, you can use existing FX as starting points to create your own custom versions with extra parameters or features.

Case Study: Installing and Using the v002 Quartz Composer FX Plugins

Tips and Notes:

  • ISF based FX end in “.fs” and are located in “/Library/Graphics/ISF”.
  • Quartz Composer based FX end in “.qtz” and are located in “~/Library/Application Support/VDMX/qcFX/”.
  • When creating a custom version of an existing FX, make a duplicate before changing the file instead of modifying the original.
  • Read more about the ISF specification and download the test+tutorial filters.

Assignment:

  1. Create or Modify an ISF Based FX.
  2. Create or Modify a Quartz Composer Based FX.

Challenge:

  1. Find and install a FreeFrameGL Plugin.

Manually Setting a MIDI, OSC or DMX Address for a Slider or Button


While the Hardware Learn Mode and detect methods are often the fastest way for a VJ to connect sliders, buttons and other elements from a physical controller to their corresponding UI items in VDMX, sometimes it can also be useful to manually type in these addresses instead. For example, you may need to do this if an item on the controller sends multiple MIDI values at the same time, or when setting up a project from a spec sheet while the device isn't actually plugged in.

This setup technique can be accomplished by using the “Receive” tab in the UI Inspector where individual receivers for an interface control can be directly configured.

The + / - buttons are used to add receivers to the control. Once created you can click on the text string to manually enter an address path to receive MIDI, OSC, DMX, keyboard or any internal data-source available in VDMX.

Along with being able to directly set one or more address paths to listen to, the inspector and its sub-inspector contain lots of other options for setting how the incoming values should be interpreted or translated before being applied. For example, the directionality of a slider can be flipped by enabling the “Invert Val?” toggle in the sub-inspector.


Tips and Next Steps:

Sending DMX From a VDMX Color Picker


Like most interface items in VDMX, the Color Pickers can send their current state over the DMX protocol for syncing with lighting consoles and fixtures, making it possible for a VJ to control both the visuals and lights at a live event. In this video tutorial we'll set up a DMX controllable lighting fixture and create a Control Surface plugin with interface items for adjusting each of the available parameters.

Before getting started you may need to configure your system Network settings in System Preferences and the DMX / ArtNet settings in the VDMX preferences. The exact settings will vary depending on the equipment being used. Also note that the hardware may come with additional setup software you need to run.

System Preferences and ENTTEC NMU setup program to configure the hardware.

The VDMX5 Preferences for setting universe addresses for each DMX input and output port.

Simple DMX Fixture, “ENTTEC Open DMX Ethernet” and Thunderbolt to Ethernet adapter.

For this basic setup we have a single port ENTTEC Open DMX Ethernet converter, but any ArtNet compatible device will work. We're connected directly to the device over cat-5 using a Thunderbolt to Ethernet adapter as pictured above. The fixture being used accepts 4 DMX channels: 1 for the strobe / brightness level and 3 for the RGB color.

To begin we'll add a Slider to the Control Surface and set it to send on channel 1 for the strobe rate / brightness level.

Next to control the color of our light we'll add a Color Wheel element and use the UI Inspector to add a DMX Sender. From the sub-inspector we can adjust the channel offset to match the template of the fixture.


Notes and Next Steps:

The UI Inspector Sending tab for the Color Wheel showing DMX Sender options.

Using Video Game Controllers With VDMX

Using a PS3 based HID Controller with VDMX

While many VJs and musicians use equipment designed to look like traditional instruments for running their visuals, another great way to trigger video clips and FX during a performance is with gaming controllers.

In this tutorial we'll look at three different examples of using such setups in VDMX, but as always the real fun is adding them into your own projects and templates.


1. HID (Human Input Device) Game Controllers

Using the “HID Input” plugin you can receive button and joystick data from many generic USB or Bluetooth based game pads. Once a device is selected in the device menu, any detected elements from the controller will be available for hardware learn and assignment as standard data-sources.

To get started quickly with using the standard bluetooth PS3 HID Controller with VDMX try using the example template which includes a pre-mapped Control Surface plugin for each element on the device. This technique can be useful for visualizing the current state of a controller or for testing behaviors when a device isn't available to play with.

Note that the HID specification also includes lots of other types of devices that can be mapped in interesting ways for VJing.

 

2. Keyboard Input Based Game Controllers

Some game pad controllers such as the X-Arcade consoles show up as a USB keyboard and send standard key commands when buttons are pressed or joysticks are moved. When using these controllers no extra plugin is needed in VDMX; each assignment can be made using hardware learn mode or by manually entering the data-source paths in the UI Inspector pane.

It is important to remember that since these are registered as keyboard presses, they will only be picked up by VDMX when it is the frontmost application. You may also want to consider this if you need to live type text during your performance.

 

3. WiiMotes

Similar to using HID based inputs, the “WiiMote” plugin in VDMX makes the buttons, joysticks and accelerometers of a Bluetooth based WiiMote available as standard data-sources. For more information also see the in-depth WiiMote tutorial page.

For this video tutorial we'll start by enabling Bluetooth communication and detecting the controller with the WiiMote plugin in VDMX. Once the basic configuration is working we can begin to put together a sample project designed to get the most out of combining accelerometers with game-style buttons for performing live visuals.


Tips and Notes:

Assign buttons from the right-click detect, UI Inspector, or Hardware Learn.

Pair Bluetooth HID and WiiMote controllers in the System Preferences.

We used the X-Arcade Tankstick + Trackball model in this tutorial.

Luma Key Techniques for Layer Composition


Download the completed project file and sample media for this tutorial.

Prerequisites:

Using Masks and Luma Keys to create a layered composition.

  1. Read the tutorial on Layer Composition which covers opacity and blend modes.
  2. Read the tutorial on Adding an FX to a layer.
  3. Read the tutorial on Applying a Mask to a layer.

One of the most powerful techniques for combining multiple layers of video into an output is the use of masking, also known as luma keying. With this process, two video sources are combined to create a "cut out" layer that can be composited over other images like a collage, instead of simply blending them together. This style is commonly used in music videos and graphic design, and by VJs for live performance. Masking is also an important technique when projection mapping video onto surfaces.

The basic idea behind this process is easy to learn and can be repeated any number of times within your VDMX project to create complex layered visual scenes.

Typically for our mask we'll want to use a video source that has clearly defined shapes or patterns, preferably black and white or high-contrast; a few examples are used in this tutorial.

When applying the mask to a layer using the Layer Mask FX, the regions that are white will become textured by the source video, and the regions that are black will show through to video playing on the background layers behind it. Areas that are gray will mix between the two.
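That mixing rule is essentially a luma key, which can be sketched in a few lines of ISF (illustrative only; the bundled “Layer Mask.fs” has more options): the mask's brightness becomes the foreground's alpha, so an “Over” composition mode lets the background show through the dark areas.

```glsl
/*{
  "DESCRIPTION": "Sketch of a basic luma key / layer mask FX",
  "CATEGORIES": [ "Masking" ],
  "INPUTS": [
    { "NAME": "inputImage", "TYPE": "image" },
    { "NAME": "maskImage", "TYPE": "image" }
  ]
}*/
void main() {
  vec4 src = IMG_THIS_PIXEL(inputImage);
  // brightness of the mask at this position: white = 1.0, black = 0.0
  float luma = IMG_NORM_PIXEL(maskImage, isf_FragNormCoord).r;
  // white regions keep the source, black regions become transparent,
  // gray regions mix between the two
  gl_FragColor = vec4(src.rgb, src.a * luma);
}
```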

Foreground Layer before mask is applied.

Masking Layer provides the "cut out" shape.

Foreground Layer after the mask is applied.

Composited over a background.

When creating layered compositions using luma keying, a useful tip is to organize media files into three types: overlays, masks and backgrounds. The masks will provide the "cut out" shapes that appear over the background clips and the overlays will be the texture video that fills our shapes. In this video tutorial we'll begin by loading our files onto separate pages for these purposes.

After the media files are loaded we'll move to the Workspace Inspector in the Layers tab where the layers from our scene can be managed. For the basic example we'll create three layers: a Foreground, Background, and Mask.

As detailed in the applying a mask to a layer tutorial, once this basic layout is prepared we need to adjust two layer settings and add an FX. First, in the Layer Composition Controls for the Mask Layer, set the Opacity to 0.0 (or use the hide/show button) so that the mask is not visible in the final output directly.

Next on the Foreground Layer set the Composition Mode to "OpenGL-Over" (or "VVSource-Atop") and add a "Layer Mask.fs" FX with its Mask Image input set to receive from the "Mask Layer" that we've created.

Once this is complete, we'll repeat the process to add a midground layer between the foreground and background layers to make the scene more visually interesting.


Tips and Notes:

Use the OpenGL-Over composition mode on foreground and overlay layers.

Use the Chroma Mask FX to convert blue screen footage to a grayscale mask.

Text layers and other built in sources are also useful as mask generators.

Connecting VDMX and Unity3D by Syphon


Download the completed VDMX and Unity3D project for this tutorial.

Fun dome projections during our visit to the LES Girls Club.

Note: This is an updated tutorial topic – the Unity3D game engine was recently updated so that the personal edition can use 3rd party add-ons, a feature that was previously limited to the pro version.

While mainly designed for cross-platform game development, the Unity3D engine is used by many Mac VJs to create 3D worlds and other real-time generated graphics for visual performance. By connecting these environments to other VJ applications like VDMX over Syphon and OSC, we can control these worlds and mix, process, and output the virtual camera signals from a scene like any other live media source.

In this guest tutorial we're joined by Dave Pentecost, manager of the planetarium at the Lower East Side Girls Club of NYC. Along with weekly tours of the known universe in their dome, Dave teaches the girls how to use the software needed to make their own digital creations to project into the space.

This example begins by creating a simple scene in Unity3D that contains a plane and a basic character package known as Ethan.

Once this scene is created the next step is to add the Syphon asset package to the Unity3D project. For each of our three surfaces (Ethan's body, glasses and the ground) a different material is created. Then a matching layer is created for each surface in VDMX to provide video streams for display in the scene using the Syphon Output plugin.

Tips and Notes:

  1. Visit the Github page for the latest version of the Syphon plugin for Unity3D. No additional installation is needed to publish or receive video over Syphon in VDMX.
  2. Enable the "Run in Background" option in the Unity3D Resolution and Presentation settings to keep the scene active when VDMX or other VJ applications are being used.
  3. After getting VDMX to send video to Unity3D, try publishing a camera in the Unity3D environment to Syphon and receiving it on a layer in VDMX.
  4. To send control information besides video between applications check out add-ons for receiving OSC messages in Unity3D from VDMX data-source providers.

Guest Tutorial: 10 Workflow Tips from DocOptic


We really enjoyed this tutorial from the DocOptic crew who were gracious enough to let us share it here with some extra notes.

DocOptic is an independent team of artists with a background in the creation and performance of live visuals, 3D motion graphics, and music production. We also enjoy sharing knowledge with new and existing audio-visual performers through tutorials that explain how various live visual software works and how it can be used in live performances.

This tutorial goes over some of our favorite tips we use to improve our workflow while using VDMX including keyboard shortcuts, BPM automation, presets, and more. Also covered are a few techniques using our most used features of VDMX such as the Alpha Mask effect and using application windows as media sources.

 


Tips and Notes:

Save and recall the states of your parameters using Presets.

Remove footage backgrounds using the Alpha and Layer Mask effects.

Create a Control Surface to easily access your favorite effects and parameters.

Use marks to trigger different playback positions of your media.

Use other applications as media sources with Window Inputs.

Automate parameters to BPM to keep your visuals in sync with audio.

Analyzing multi-track audio from Live in VDMX using Soundflower


For musicians working in Ableton Live or other multitrack production software, one of the most useful tricks for driving real-time visuals is to output each track on a different set of audio channels before they are mixed together, which gives more accurate results for each sound when performing audio analysis in VDMX.

In this video tutorial we'll look at how to accomplish this technique by using the Soundflower audio routing system extension which allows passing of audio streams between applications.

This same idea can be used with bands or sound boards by using a multi-channel audio input that receives each instrument before it is mixed.

To begin, set the output device in the Live audio preferences to use Soundflower 64 and configure the needed output channels. Next, set each track in the main Live view to send on a different external output channel instead of the Master feed. If needed, use the Soundflowerbed utility to route the audio to your speakers for preview.

Once the audio side has been set up, in VDMX we'll create an Audio Analysis plugin for each of the tracks playing in Live. For each plugin a new set of frequency bands can be tuned to the incoming sounds. The individual data-sources are available for controlling the FX and source parameters of any layer.


Notes and Next Steps:

Download and install Soundflower.

Set the output device and track channels in Live.

Create multiple audio analysis plugins in VDMX.

Using audio analysis to trigger a movie.

Visualizing audio FFTs and waveforms.

More fun audio analysis techniques...

4 Layer Korg nanoKONTROL2 template


Download the Korg nanoKONTROL 4 Layer Example with Sample Media

One of the most popular controllers used by VJs is the Korg nanoKONTROL, a versatile set of sliders, knobs and buttons that can be easily mapped to different setups. The goal of this more generalized setup is to provide a good standard VJ rig for this controller: 4 layers with playback / mixing / color adjustment and clip / page switching, along with a set of both manual and audio-reactive FX that can be individually enabled.

For this project the layout is split into three sections, with playback control on the left, mixing on the middle 4 sliders, and audio reactivity on the right 4 sliders.

4 Layer VDMX setup for Korg nanoKONTROL

Example project with controller and display

Main controller layout

In the playback controls section the buttons under the 'TRACK' heading are connected to changing pages and moving to a random / previous / next clip on the current page. Below the 'MARKER' heading are buttons for adjusting the movie rate and direction, and for jumping the playback backwards and forwards.

The mixing section in the middle contains the controls for adjusting the opacity slider and luma cutoff knob for each of the 4 layers. Additionally each layer has 3 optional style FX (Invert, Mirror, RGB trails) that can be enabled using the S, M, R buttons corresponding to each layer.

The audio reactivity section is used to adjust the gain level for the FX enabled on the corresponding layer. As in the mixing section, the S, M and R buttons are used to enable the Shake, Blur and Dot Screen reactivity. Above each slider is a knob for manually shifting the hue of the layer to change the color palette of your visuals.


Notes:

Playback controls in detail

Mixing and manual FX in detail

Audio reactive and color FX in detail

Using the ISF Editor To Create GLSL Generators and FX


Download the latest ISF Editor to follow along with this tutorial.

The ISF Editor displaying the Random Shape generator.

When creating new ISF based generators and FX for use in VDMX or other supported applications, one of the most useful tools is the free ISF Editor. In this video tutorial we'll look at the basics of using the ISF Editor to create a simple generator and FX.

While you can use any text editor to work with ISF files, the ISF Editor includes several useful features such as:

  • Live preview of the shader's output.
  • Creation of new ISF shaders from a useful starter template.
  • Movie, image, Syphon, and camera inputs as test feeds for FX.
  • Publishing of the final output to Syphon for live use or preview in other applications.
  • Rendering of generator ISF files out as movie files.
  • Error message display for debugging.
  • Preview of any render pass for debugging multi-pass shaders.
  • Import of (some) shaders from GLSL Sandbox and Shadertoy.

Before getting started, you may also find it useful to load the ISF Specification page in another web browser window as a reference.


In the first section of this tutorial we will create a simple ISF based FX starting from the “New ISF file” option in the File menu.

Using this menu option creates a new ISF based FX with a pass-thru template that includes an example of each input variable type.

For this walk through we will create a simple FX that demonstrates two basic image transformations: moving a pixel and modifying its color.


For the second part of this tutorial we will import an existing shader from GLSL Sandbox, add our own custom parameters, animate a property using the automatic TIME variable and then export it as a movie file.
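The automatic `TIME` variable is a float holding the seconds since rendering began; as a trivial sketch of animating with it, a generator can pulse its brightness:

```glsl
/*{
  "DESCRIPTION": "Sketch: animate brightness with the automatic TIME variable",
  "CATEGORIES": [ "Generator" ],
  "INPUTS": [ ]
}*/
void main() {
  // TIME is supplied by the host in seconds; sin() gives a smooth pulse
  float pulse = 0.5 + 0.5 * sin(TIME * 2.0);
  gl_FragColor = vec4(vec3(pulse), 1.0);
}
```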

The first step is to choose the “Import from GLSL Sandbox” option in the File menu and enter the link from the website. The automatically created ISF file will be saved to “~/Library/Graphics/ISF” where applications such as VDMX will find it.

For shaders that are too intensive to play in real-time, or for creating a demo reel for the web, the “Render to movie...” option in the File menu can be used to export a movie file that can be played instead.

For situations where the ISF Editor cannot automatically import shaders from other sites, try consulting the notes for manually converting shaders on the ISF Specification page.


Tips and Notes:

Debug messages can be found in the Error console.

Preview individual render passes in the viewer window.

Export ISF generators as movies.

Receive from and publish back to Syphon.

Use any text editor to work with ISF files.

http://www.interactiveshaderformat.com/ has more example ISF files to get started.

Media Bins: Triggering Multiple Clips At The Same Time


Download the completed project files for this tutorial. Download sample clips to use with these projects.

Example project using multiple triggers from a single media bin.

One of the common things you may want to set up using media bin plugins is the ability to trigger multiple clips to different layers at the same time. This allows a VJ to match specific clips for mixing in advance and bring them up with a single keypress or MIDI note. Within VDMX this can be configured in a few different ways depending on your use case or interface layout preferences.

Before starting this tutorial you may want to review some basic notes on using the media bin.

When finished with this tutorial you may want to check out how to work with multiple output screens or learn more about different types of video sources available in VDMX.


In the first example we'll use a single media bin that is configured to receive key presses that trigger clips. From the media bin inspector, in the triggers tab, you can override the default bin layer for each receiver to target a specific layer for playback. For each layer needed, an extra receiver / layer target pair is added. If needed, repeat for additional key presses, or use this along with the transpose up / down options in the control tab. Once your bin is set up you can move clips around (shift+click to select, then drag) to change which layer they are played back on.

Tip #1: Instead of key presses you can use any data-source including MIDI, OSC and DMX messages.

Tip #2: In the Media Bin inspector under “options” you can find settings to hide trigger overlays and preload media files.


For the second example we'll be using multiple media bins that are each set to trigger to specific layers.

This technique can be especially useful if you are also using Two Channel Mixer plugins to auto-fade between clips. To begin we set up a single bin with the triggers that will be received. After the triggers are set we can use the menus tab to select which layers and pages are used by this particular bin. Next we can use the duplicate plugin option to create a copy of this media bin. We'll change its layer/page settings but keep the trigger receivers the same. This can be repeated if more layers are being used.


A third useful trick is similar to the previous, but instead of having each bin set up to receive the same note triggers we'll use one media bin as a master media bin whose triggers are passed on to activate clips in other bins.

Along with being able to receive from data-sources, each media bin also publishes the index and normalized (0.0 to 1.0) value of each clip trigger. This value can then be used by other bins via the “trigger by index” or “trigger by float” options in the control tab of the inspector. One advantage of using this technique is that you can use mouse clicks in the master bin to trigger multiple clips.
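As a rough sketch of the relationship between those two published values (the exact mapping is internal to VDMX; this assumes a simple linear normalization across the bin, so treat the function names and formula as illustrative only):

```python
# Hypothetical illustration of the mapping between a clip index and the
# normalized (0.0-1.0) trigger value a media bin could publish.
def index_to_float(index, clip_count):
    """Map a clip index to a 0.0-1.0 value, assuming linear spacing."""
    if clip_count < 2:
        return 0.0
    return index / (clip_count - 1)

def float_to_index(value, clip_count):
    """Inverse mapping: recover the nearest clip index from a 0.0-1.0 value."""
    return round(value * (clip_count - 1))
```

With a mapping like this, a "trigger by float" receiver in a second bin would land on the same slot as the index triggered in the master bin.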


Notes:

Using the “triggers” section of the media bin inspector to set target layers for individual receivers.

The same triggers can be used for multiple media bins.

Use the “Triggered Index” data-source from your master media bin with the “Trigger by Index” control receiver.

Using ProjectMilkSyphon With VDMX

Download ProjectMilkSyphon or read more on the release page.

ProjectMilkSyphon includes hundreds of amazing community made presets.

While the true mastery of VJing often lies in the painstaking creation of your own material and performing it live, there are times when it helps to have a solid audio visualizer in your back pocket. It might be something that the client asks for, or you might need to let something run automated for a few minutes while you go fix some cables that came loose. Or we can be honest and admit that there are even some times when it is the best tool for a job.

For those who aren't familiar, MilkDrop started as a WinAmp plugin that was one of the first audio visualizer tools to break into the mainstream. In more recent years the projectM community has been working on an open source implementation that developers can use cross platform.

ProjectMilkSyphon is a free standalone app for the Mac that can be used alongside any Mac VJ application that supports Syphon inputs. In this video tutorial we'll begin by looking at the options available within ProjectMilkSyphon and demonstrating how to receive the video feed in VDMX.

To begin, download and launch ProjectMilkSyphon. If this is the first time you are running it you may be prompted to install additional files before proceeding (installed presets are in the "/usr/local/share/projectM/" directory). The application consists of two windows, one for output and the other for adjusting settings.

From the main list in the Settings panel you can switch the active visualizer preset. If needed, adjust the math resolution and output resolution options to meet your CPU / GPU needs. The auto-trigger mode can be enabled to step through the list of presets or select a random visualizer at specified intervals.

Tip: To receive audio directly from iTunes or other music software try using SoundFlower.

Tip: Check out more tutorials on making audio reactive visuals for VDMX.


Here are some examples of the output from ProjectMilkSyphon:

(Gallery: eight screenshots of ProjectMilkSyphon preset output.)

Notes

Settings panel in ProjectMilkSyphon

Syphon feeds can be directly received by layers or added to media bins in VDMX.

Using VDMX as a Syphon Mixer

Download the completed project file for this tutorial.

Mixing three Syphon sources in VDMX

One of the best things about being a VJ on the Mac these days is Syphon which makes it possible for all of the different tools that are available to work together in countless ways. Within VDMX it is possible to have as many Syphon inputs and outputs as your computer can handle, which allows for it to be used as a source, mixer, FX processor or final output for other software you may want to work with.

In this video tutorial we'll look at a simple use case for connecting several Syphon enabled applications to and from VDMX by creating a two channel mixer that fades between two Syphon sources and publishes the mix back out for other apps to receive.

Tip: While you do not need to install anything extra to use Syphon, you may find it useful to download the “Simple Server/Client” test applications from the Syphon website.

To begin this basic setup we'll be using VDMX alongside two free Syphon generator applications that you can download from our site: GifToSyphon and ProjectMilkSyphon. Along with these we'll also be using the Simple Server program and some simple ISF based generators which are good to have on hand for testing purposes.

For each of the sources that we'll be expecting to receive we can set up a different Preview window to monitor the feed before it gets triggered. Adding a second layer and a crossfader plugin enables us to mix between any two of the feeds. You can set the media bin to automatically cycle between the two layers, or manually switch which layer is being targeted.

To finish the setup we add a Syphon Output plugin to the project which lets us then publish back out to be received by the Simple Client app to confirm that the output is working. At this point you can also receive the mix from VDMX in other software you may want to use for final output or further processing.

Next learn more about adding FX to layers in VDMX.


Notes:

Use apps like ProjectMilkSyphon as a source in your favorite VJ software.

Preview plugin can view Syphon sources even when not active on a layer.

Use the Syphon Output plugin to publish any layer or group to Syphon.


Using LUT FX in VDMX

One of the common types of image filters found in photo and video editing workflows are LUTs, also known as "Look Up Table" based FX. LUT FX are used to change the color palette of an image to create a different stylized look or feel, or in some cases to mimic the look of different print film types.

A few example images demonstrate the power of applying different LUTs to the same original video frame:

Original

Adventure

Dystopia

Ilford HP5

Another cool thing about LUTs is that you can easily create your own without any programming knowledge. Using software like Adobe Photoshop you can load in an image, adjust its color curves and then export the color shifts to the ".cube" LUT format to use in VDMX. You can also search online to find lots of free LUTs created by other people.
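If you'd rather generate a LUT programmatically, the ".cube" format itself is plain text: a `LUT_3D_SIZE` header followed by size³ rows of RGB values with the red channel varying fastest. A minimal sketch (the warm-tint curve here is just an example adjustment, not anything VDMX ships with):

```python
# Write a minimal .cube 3D LUT that warms the image by boosting red
# and slightly pulling back blue. In the .cube format the red channel
# varies fastest, then green, then blue.
def write_cube(path, size=8):
    lines = ['TITLE "Warm tint"', f"LUT_3D_SIZE {size}"]
    for b in range(size):
        for g in range(size):
            for r in range(size):
                rf, gf, bf = (c / (size - 1) for c in (r, g, b))
                rf = min(1.0, rf * 1.1)   # boost red, clamped to 1.0
                bf = bf * 0.9             # pull back blue
                lines.append(f"{rf:.6f} {gf:.6f} {bf:.6f}")
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Drop the resulting file into the LUT assets folder and it should show up alongside the built-in styles.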

When used in VDMX, LUTs can be applied to any media layer in real-time, whether it is a live camera or prerecorded footage, just like how other FX are added.

In the LUT category of FX, there are three different options:

  • LUT: Pick a single LUT to apply to the incoming video frame.
  • LUT Mixer: Pick two LUT styles to apply and crossfade between.
  • LUT Mask Mixer: Pick two LUTs and use a masking image to provide per-pixel mix levels between them.

In this video tutorial we'll be looking at how to use each of these different FX and how to add your own ".cube" LUT files.

When you've mastered this, try using a Movie Recorder plugin to capture the results as video files / still images, or check out other ways to use image masks in VDMX.


Notes and Tips:

Access the LUTS Assets folder for VDMX from the Help menu.

Add LUT FX to any layer.

Crossfade between two styles with the LUT Mixer FX.

Create complex mixes with the LUT Mask Mixer FX.

Gesture Recording With the Data Looper

The Data Looper plugin in VDMX lets you create tracks that record data (values) from a data source, and then loop that data back, publishing it to the track's data source. Recording and playback is always quantized to the chosen clock, and the plugin also has a built-in editor that allows for quick and extensive modification of the recorded data, including scaling, warping, translation, and deletion.

In this tutorial we'll be looking at how to use the Data Looper to record incoming MIDI data and loop it quantized to the VDMX clock. For more information also see the Data Looper section of the VDMX wiki.

Playback and recording is track-based: a single plugin can have multiple tracks. Each track's value is published as a data source in VDMX. You can map these data sources using any of the usual means, or by ctrl-click-dragging from the track preview (or the track editor) to any UI item. All tracks in a plugin instance are quantized to the same clock, which is selected by the pop-up button in the top-right of the plugin window. This means that all recording and playback is quantized- as long as your clock is in sync with the real world, everything you record and loop will also be in sync.

In the main plugin interface choosing "View All" displays a thumbnail preview and basic controls for every track in the plugin. Each track has its own set of controls for recording and playing back data, which are visible in this view- there's a pop-up button for changing the track's mode, and a value slider which is used by many of the modes. Every track has a "mode"- this mode determines what the track is doing at any given point in time.

For making adjustments to the recorded data, the track editor interface allows you to make chronological modifications to the track's data. You can switch to the editor UI by either double-clicking on a preview in "View All" or selecting a specific track from the plugin's pop-up button.

The main editor area has three basic sections. A small strip along the bottom can be used to change the duration of the track; this automatically quantizes edits to the nearest full measure. Along the top of the editor is another strip that lets you create or reposition warp points. Warp points are temporary markers that are used to distort or rearrange the data in a track: click in the strip to make a warp point, click and drag to reposition one, and "tear it off" the strip to delete it. The majority of the editor is taken up by a preview of the data you recorded. If you click and drag a warp marker in this area, the data on either side of it will be distorted. You can also click and drag the space between warp points to slide it around and rearrange the track data. If you hold down option while you click and drag, you'll copy the data before moving it.


Notes:

A Data Looper plugin displaying all of its tracks, both of which are playing back.

A Data Looper plugin, editing one of its tracks.

The Data Looper's inspector options.

Introduction to The Timecode Plugin

Timecode plugin driving VDMX parameters.

When designing preplanned shows and working alongside other software, one of the common tasks is keeping the timing of everything in sync. Within VDMX there are two main ways of working with time – the Clock plugin which is used for working in measures and beats and the Timecode plugin which counts in SMPTE time.

In this set of tutorials we'll cover the basics of using the Timecode plugin, which publishes several data sources in VDMX and is capable of both receiving and sending SMPTE timecode in several common formats.

Timecode Overview:

  • A single Timecode plugin can only receive from one source at a time- it can receive from MTC (MIDI Timecode), LTC (Linear Timecode), or from any data source in VDMX (including any MIDI/OSC/DMX data VDMX receives). Timecode plugins can also generate their own timecode locally, using a variety of framerates.
  • The values received by a Timecode plugin are published as a data source in VDMX: a floating-point number describing the time in seconds.
  • These values can be used to drive Cue Lists, LFOs, control movie playback directly, etc.
  • A single timecode plugin can have multiple reference times, which are configured in its inspector- the time passed since the last reference time is published as a data source in VDMX, along with the index and name of the reference time.
  • A single timecode plugin can publish its value to multiple destinations, in multiple formats. Values can be sent to other devices using MTC, LTC, and OSC.

Introduction to Timecode

To begin we will look at some of the ways that Timecode plugin can be used to directly control parameters within VDMX. Like most plugins this is accomplished by linking sliders, buttons and other interface items to the data-sources published by the Timecode plugin.


Receiving Timecode from MTC

One of the more common protocols for passing timecode between different software is known as MTC (MIDI Timecode) which can be both sent and received from the Timecode plugin in VDMX.

In this quick example we will use QLab to generate a MTC signal that will be received by VDMX over the system MIDI IAC Driver Bus.
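Under the hood, MTC spreads each SMPTE frame position across eight quarter-frame MIDI messages (status byte 0xF1, one data nibble each). As a sketch of the encoding — useful for debugging what a generator like QLab is actually sending, and not code you need for the VDMX setup itself:

```python
# Encode an SMPTE time as the eight MIDI Time Code quarter-frame
# messages. Each message is the status byte 0xF1 followed by one data
# byte: high nibble = piece number (0-7), low nibble = payload.
RATE_CODES = {24: 0, 25: 1, 29.97: 2, 30: 3}  # 29.97 is the drop-frame rate

def mtc_quarter_frames(hours, minutes, seconds, frames, rate=25):
    rr = RATE_CODES[rate]
    nibbles = [
        frames & 0x0F,  frames >> 4,
        seconds & 0x0F, seconds >> 4,
        minutes & 0x0F, minutes >> 4,
        hours & 0x0F,   (rr << 1) | (hours >> 4),  # piece 7 carries the rate bits
    ]
    return [bytes([0xF1, (piece << 4) | d]) for piece, d in enumerate(nibbles)]
```

Since one full timestamp takes eight messages, MTC receivers only lock to the complete time every two frames, which is why slaved apps take a moment to chase.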

Tip: For simple cases MTC can also be directly received by sliders without using the Timecode plugin.


Receiving Timecode from LTC

Another common way to send timecode between systems is using LTC (Linear Timecode) which encodes the values as sound over a standard audio connection.

In this example we will receive LTC sent from another computer through a USB audio interface. The sample rate of the device and the framerate of the signal will be displayed when values are being received. If needed, multiple LTC signals can be received on different audio channels by adding more Timecode plugins to your VDMX project.

Tip: For working with LTC between applications on a single computer you can use SoundFlower to patch the audio.


Receiving Timecode from OSC

Though there is no formal protocol for sending timecode over OSC, you can send time values (in seconds) as float values to drive a Timecode plugin.

As with internally generated timecode, when using OSC you can set the framerate that should be used locally.
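Because the payload is just a single float argument in seconds, you can drive the plugin from any environment that can send UDP. A minimal stdlib-only sketch (the address path and port here are hypothetical — match them to whatever OSC input you configure in VDMX):

```python
import socket, struct

def osc_message(address, value):
    """Pack a minimal OSC message carrying one float32 argument."""
    def pad(b):
        # OSC strings are null-terminated and padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Send a time value of 92.5 seconds to a hypothetical VDMX OSC input.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/timecode/seconds", 92.5), ("127.0.0.1", 1234))
```

Loop this at your desired framerate and the Timecode plugin can chase the incoming values.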

Tip: When working with OSC the Console plugin can be used to view a log of incoming values.


Using a Backing Track Movie to Drive Timecode

In addition to slaving a Timecode plugin to external sources you can use any internal data-source to set the current time in seconds.

For this example we will use a backing track movie playing on a layer as the value that is used to drive the timing for other controls in our project.


Sending Timecode from VDMX as LTC, MTC, and OSC

Each Timecode plugin can be set to publish its current time position out to one or more other systems simultaneously in three different supported protocols: LTC, MTC, and OSC.

  • For LTC, select an audio interface and channel.
  • For MTC, select a MIDI device / bus.
  • For OSC, select an OSC output and address path.

In this example on the receiving side we will use Horae for LTC, MIDI Monitor for MTC and our own OSC Test App to print out OSC values.
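Since every protocol above ultimately carries the same HH:MM:SS:FF position, it can help to convert the plugin's float-seconds data source to SMPTE by hand when cross-checking receivers. A sketch assuming non-drop-frame counting:

```python
def seconds_to_smpte(t, fps=30):
    """Format a float time in seconds as HH:MM:SS:FF (non-drop-frame)."""
    total_frames = int(round(t * fps))
    ff = total_frames % fps          # frame number within the second
    s = total_frames // fps          # whole seconds
    return f"{s // 3600:02d}:{s % 3600 // 60:02d}:{s % 60:02d}:{ff:02d}"
```

Drop-frame formats (29.97) periodically skip frame numbers to stay in sync with wall-clock time, so they need additional bookkeeping beyond this sketch.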

Guest Tutorial: Using Lumen and VDMX together with Syphon

Today we have special guest Wiley Wiggins bringing us a video tutorial showing off how to use the video synth Lumen alongside VDMX by connecting them over Syphon.

Lumen now also supports MIDI so you can easily set it up to receive messages sent from any interface item in VDMX. In particular you can configure a Control Surface plugin designed to match your Lumen controls as needed.

Next up, check out the artist feature on Wiley just posted over on the blog!

Using an iOS Device as a Live Camera Source in VDMX

Now that iPhones, iPads and iPod Touch devices are readily available as extra mini-computers in our everyday lives, VJs and other video artists can take advantage of this by using them as additional live camera inputs in VDMX.


iOS Screen Mirroring over Lightning Cable

One of the cool new features that is available with recent updates to iOS is the ability for your Mac to directly capture the video on the display of your iPhone or iPad without any special hardware other than a standard USB to Lightning cable. Whatever is on the screen of the device will be available as a live input making it possible to use any of your favorite mobile apps as sources for VDMX.

In this first video tutorial we'll look at how to connect an iPhone as a camera input and display a running app as a layer. Like other camera inputs we can also play through the audio and record directly to disk as a movie file that can be either immediately remixed or saved for later use in other video software.

The first step is to connect your iOS device to your Mac with a Lightning cable.

Mac, iPhone 5s and USB to Lightning cable.

iPhone 5s connected to Mac by USB to Lightning cable.

Once connected, the iOS device will appear as an available source for layers under the "Live Input" category along with any other AVFoundation supported webcams, video digitizers or BlackMagic capture devices.

You can access more options for the device by going to the "Workspace Inspector" in the "Vid In" tab. From here you can configure an optional audio play-through for the sound from the device. Using the available controls you can also record audio and video directly to disk as movies which can be automatically loaded onto a page in VDMX for immediate remixing.

Tip: If you wish to record into a different video format, or need additional recording options, you can also use the Movie Recorder plugin.


Wireless iOS Camera via AirBeam and Syphon

Another useful technique when working with iOS devices is to use an app called AirBeam to transmit the camera feed wirelessly for remixing in VDMX. One advantage of this technique is that it works over a wireless connection, which means it will also work with older iOS devices that use the 30-pin connector cable.

AirBeam can be purchased in the iOS App Store. To receive video feeds on your Mac you'll also need to download their free client software, which connects to VDMX via Syphon.

Once you have the software installed, launch AirBeam on your iOS device and on your Mac, and make sure the devices are on the same wireless network. On your Mac you should see the iOS camera appear with a small preview. Within VDMX the AirBeam camera will be available in the list of Syphon inputs and can be added to a media bin page to be triggered like any other media file.

AirBeam on the Mac shows a preview of the camera and includes other useful controls.

The raw camera feed is available via Syphon in VDMX.


Once you've mastered these techniques for connecting your iOS device to VDMX, move on to tutorials that involve using the audio feed in fun ways or compositing the camera feed with other layers using masks.
