Tutorials - VDMX - MAC VJ SOFTWARE

Automatic BPM Detection in VDMX using Waveclock


Often when working with DJs and other musicians, it is the job of the VJ to keep visual events in sync with the BPM of the music that is playing. In some cases it is possible to get this information directly from the software they are using as MIDI Clock or MIDI Time Code, but otherwise keeping a beat clock in sync requires tapping out a tempo or manually dialing in a value.

By enabling the “Waveclock” beat tracking feature in the VDMX Clock plugin, the music from a microphone or line input can be analyzed to automatically handle the adjustment of the BPM and measure position to ensure that the timing of changes in your video are perfectly in sync with the bands and DJs that you are working with. 

For this introduction tutorial we'll begin by simply activating the Waveclock detection in the Clock plugin and playing some music to watch the beat sync indicator and BPM readout lock on. Then from the Templates menu we can load the example project to see the sync in action. In this basic demo setup, Step Sequencer and LFO plugins are used to automate the controls of a generative Quartz Composer composition playing on a layer.

After watching this, read up some more on using color tracks in the sequencer, or try adding an Audio Analysis plugin to the template for even more A/V sync.

Notes:

Click the Wavesum logo to enable BPM detection. In the Workspace Inspector, customize the audio input settings used for analysis.

The Waveclock demo and other example VDMX setups can be loaded from the Templates menu.

“SoundFlower” is a free sound routing utility for OS X that can be used to send audio from music apps such as iTunes and Ableton Live directly to VDMX.

The Waveclock standalone application can also be used with any VJ app that supports MIDI clock.


How to build the Waveclock Demo template


Also see the intro tutorial explaining how to use the Waveclock Demo template.

For this tutorial we'll be taking a closer look at using automatic BPM detection for syncing up the timing of visual events with music by recreating the core parts of the Waveclock Demo template that is included with VDMX.

The approach we'll take is to create a virtual video instrument in the form of a Quartz Composer composition and animate its interface controls with Step Sequencer and LFO plugins. Presets for patterns in each plugin can then be saved and switched to match the energy level of the music while VJing during a live set. 

In the final example template we have two layers whose source and FX parameters are controlled by our data-source plugins, set to loop patterns so that the video generators line up with the beats and major change-ups in the music playing in the background.

Using this technique as a starting point, try applying the idea to some other interactive video generators, or add an Audio Analysis plugin to drive some parameters for even more sound reactivity in the setup.


Notes:


Use “Section Presets” to save and restore patterns in the Step Sequencer plugins.


Use the “Interpolation” setting for Step Sequencer tracks to smoothly fade between values on beat.


The completed Waveclock Demo example can be loaded from the Templates menu.

Music for this tutorial is Slow Motion by LB^LC (Creative Commons license); listen to more here.

Making three templates for the Livid Base


Download the example projects for this tutorial. Sample clips from Minuek can be downloaded here and here.

When getting a new MIDI controller to use with VDMX or other VJ / music-making software, one of the most exciting parts is working out the best way to map its sliders and buttons to the controls you want to use during performance, and coming up with new ways to configure your software video generators and FX to get the most out of the layout of your instrument.

In this set of technique tutorials we'll be looking at three new example VDMX setups we've come up with for the Livid Base that take advantage of the controller in a few different ways, including its multi-color LEDs and pressure-sensitive pads.

Each of these sample projects can be used with your existing video clips, or as a starting point for creating your own custom rig with different FX or masking layers. As usual, where possible we've tried to make each template easy to remap to other MIDI controllers in case you don't have a Base of your own to use.

An extra shout out to Minuek for providing the sample clips used in these video demonstrations.

Livid Base VDMX Video Synth

A video synthesizer in VDMX mapped to the Livid Base MIDI controller.


For the first example we'll begin with a basic single layer video clip triggering setup with multiple pages and preset video filters that can be activated during performance.

Though it is fairly simple, designing a layout like this is a good exercise when considering the possibilities of using VDMX with a new controller. It can also be a useful go-to template for gigs that only involve switching or fading between clips.

The top rows of buttons are used as momentary activators for different combinations of visual FX that are applied to the output. For each individual FX there is an LFO that animates its properties, and its rate can be adjusted using the slider below the activator.


In our second template we'll take advantage of the Base's ability to send pressure information from each of the main pads as MIDI control values, giving instrumental control over a video synthesizer created with Quartz Composer. Each pad corresponds to a line particle at an x/y location, and the pressure applied adjusts its brightness and size.

By adding a background layer playing video loops to this setup, the generator composition can be used either as its own video instrument or as a masking layer that reveals only part of the playing clips at a time. Unlike in our first setup, the vertical slider is used to crossfade between these two visual modes.


For the final example, as an exercise we've created a multi-layer variation on our first template as a demonstration of some of the ways you can modify it for your own needs.

The FX activators and the LFOs controlling their individual rates are the same as in the original; however, they are now applied to a group that contains 4 different layers. The media bin cycles through which layer is used next each time a pad is pressed to trigger a clip.

Introduction tutorials translated to French


These introduction tutorials have been translated from English by Renaud Moreau.

Introduction to the LFO and Step Sequencer Plugins

Prerequisites:

  1. Read the tutorial on syncing a slider to a data-source

The project files for this tutorial can be downloaded here, or read this tutorial in English.

Also in this series, see how to work with color tracks in the Step Sequencer and how to set up the Waveclock template.

Step 1: In the 'Plugins' tab of the Workspace Inspector, create a new LFO plugin and a new Step Sequencer plugin.

Step 2: Drag the control handles in the LFO to change how the control values vary over time. Click the cells in the Step Sequencer to change the value of each column.

Tip: In the LFO, hold the 'option' key while clicking to add or remove control points along the waveform, or use the UI Inspector to set each point.

Step 3: Sync a slider (for example, a layer's opacity) to the LFO or Step Sequencer.

Notes:

  • In addition to Bézier curves, the LFO can produce sine waves, random numbers, and output from various other number generators.
  • Step Sequencer tracks support a range of data types including indexes, numbers, colors, and booleans (on/off).


The LFO and Step Sequencer plugins.


Values generated by the LFO and Step Sequencer plugins are available as data-sources for sliders and other UI elements.


Both the Step Sequencer and the LFO can have multiple data tracks / waveforms within a single plugin.


The LFO plugin inspector can be used to add, rename, delete, and customize other parameters of each individual waveform.


The Step Sequencer plugin inspector with four tracks, one for each data type. Use the 'Name' column to change the title of each track.


Receiving SMPTE MIDI Time Code (MTC) in VDMX

Download the completed project file and sample media for this tutorial.

MIDI Time Code (MTC) is a specification for sending SMPTE values from a master application, such as QLab or Apple Logic, to keep the playback time of other real-time music and visual software in sync over MIDI. While MTC has a few drawbacks, it can be useful when setting up shows where VDMX is used alongside audio software that is likely to send it.

In this tutorial we'll look at how to receive MTC in VDMX in two different ways:

First, the classic example of syncing the time position of a QuickTime movie to the incoming timecode.

In the second case, instead of a traditional movie, a simple Quartz Composer patch receives the MTC and renders the animated output values in SMPTE notation in real time.

Since MTC can be accessed anywhere in VDMX, it can also be used to drive the time position of the LFO and Step Sequencer, or relayed over OSC to other software that cannot receive timecode directly.


Materials and prerequisites:

  1. Download QLab, or other software capable of sending MTC
  2. Read the tutorial on MIDI setup within VDMX

Step 1: Send MTC from QLab.

  1. Double-click or drag 'MTC' from the sidebar to add a new 'MTC Cue' to the project.
  2. Select the MTC cue and, under its 'Settings' section, choose 'to VDMX' from the MIDI destination menu.
  3. If needed, set the SMPTE format to the frame rate of your movie.

Once the MTC Cue is set up in QLab, begin sending MIDI Timecode by clicking the 'Load' button and then the 'GO' button.


QLab with an active MTC Cue sending 'to VDMX'.

Step 2: Sync sliders in VDMX to the MTC source.

Setting up a slider or other UI element to receive MTC works like any other MIDI data-source. You can either use the 'Detect MIDI' option in the right-click menu or enter the appropriate address path (for example '/MIDI/ch. 0/MTC') in the UI Inspector window, as shown below.

When MIDI Time Code is received by a slider in VDMX, it is interpreted as a number of seconds (for example, an incoming timecode of 00:02:30:00 is treated as 150 seconds). Unlike other MIDI data such as note velocities or control values, MTC may not have an explicit maximum value as such; sliders receiving MTC automatically attempt to set their value range to this exact value in seconds, which differs from the normal default slider behavior of scaling incoming MIDI data to the slider's minimum and maximum envelope values.


Using the UI Inspector to manually configure MTC receiving on the Time Slider of Layer 1.


Quartz Composer composition with an 'MTC input' slider used to set the rendered SMPTE time.

 

Creating Workspace Presets

The goal of a Workspace Preset is to let users save a snapshot of their workspace: by default, a preset created from the Workspace Inspector saves the state of every existing layer, plugin, and UI element so that they can be restored (triggered) later. The 'Presets' tab of the Workspace Inspector provides the interface for creating, updating, managing, and restoring these top-level presets.

The project file created for this tutorial can be downloaded here.

Also in this series: an in-depth look at Workspace Presets (video tutorial + notes).

Step 1: Create a setup that you'll want to reload later. Use the Workspace Inspector to add layers and plugins, assign data-sources and FX, and reposition windows where they will be most effective.

Tip: If you're not sure where to begin a new project, try using one of the examples from the Templates menu.


Step 2: Switch to the 'Presets' tab in the Workspace Inspector, then click the 'New' button to create a preset of your setup.

The newly created snapshot will be named 'Preset 1' and will appear in the list with a thumbnail of the output.

Step 3: Modify the workspace by adding/removing layers and plugins to create a new setup, then click the 'New' button again.

Tip: After modifying your setup, you can use the 'Update' button to overwrite an existing preset with your current workspace.

Step 4: Click the thumbnails of the presets created in steps 2 and 3 to restore their layers, plugins, and window layouts.

Tip: Data-sources, including key presses and MIDI notes, can be used to switch to specific presets.

 

How to Turn an Old Building into an Interactive Driving Range by Gabe Shaughnessy & Dan Cohen


For this guest tutorial we're joined by Gabe Shaughnessy and Dan Cohen of Lumenal Code for an in-depth look at how to create a well-executed one-off video event that involves substantial preproduction, from storyboarding to animation and fabrication to the final live performance.

Red Bull Murals is a project that pairs an athlete with an artist in a unique collaboration. Red Bull asked New Creatures to create a psychedelic, immersive experience for pro golfer Rickie Fowler in Washington, DC’s historic Uline Arena. New Creatures asked Lumenal Code to provide a story, artwork, and animations, and then to create the interactive projection mapped targets and operate them during the event.

Part 1: Telling a story

We came up with a storyline using the Hero's Journey archetype, because it is the foundation of so many myths. We looked for things that inspired us, that were psychedelic without being cliché, and that would make for a cohesive story. In the end, inspired by Jules Verne, Georges Méliès, and countless nautical illustrations, we decided the ocean's mysterious qualities made the perfect setting and we set about storyboarding the experience.

 


Initial storyboard from start to finish

We knew we wanted to use a combination of projection mapping and lighting to create an immersive storyline in the space, and we wanted Rickie to shoot at several targets that would progress him through the story.

We traveled to DC to visit the location and captured a bunch of measurements and photos of the space. Using these photos, and advice from golfers about distance, height and hole size, we devised a layout for the elements in the room and started sketching out what they would look like. We spent extra time on the silhouette of each target so we would have a room filled with interesting shapes.

 


Silhouette versions of each target

Part 2: Fabricating the projection surfaces

Once we settled on a design for the targets, and a final shape for each, we created scaled vector illustrations and sent them to a shop in DC to have them cut out of plywood and painted. Rather than paint them white, our technical director, Grant Davis, advised us to use 50% gray so we would have more depth in our shadows and better contrast overall.


Part 3: Illustrating and animating the story

After the vectors were sent off we finished the digital illustrations so we had a base painting of each target to use in the animations. The illustrations were created in Photoshop, then imported into AfterEffects for animation. We used a combination of cell animation, puppet tool, particle effects, Lux, Tsunami and a handful of other plugins to create the animations. These were all exported in mpeg4 format for sharing with the rest of the team.


Turtle head concept sketches


Turtle head digital drawing

Syncing audio to the animation:

We sent the finished animations to Anthony Olander, an audio producer here in Portland, Oregon. He dropped the animation clips into Ableton Live and came up with sounds to match each clip, then sent the audio files back to us.

Once we had the sound effects, we added them to our AfterEffects files and re-exported everything in the HAP Alpha codec so we could assemble it in VDMX.

 Animating the Moon: 

We worked with Aaron Rogosin to voice and animate the moon. After scripting out the moon's lines, we filmed Aaron speaking them, then motion tracked the movements of his face as he talked. We mapped the video to a high-res photo of the moon we got from NASA. Then we split the lines into single clips we could trigger with a TouchOSC soundboard and exported them as HAP Alpha.

 


Part 4: Setting up the software for mixing and mapping

Controlling a non-linear storyline in VDMX

We had a basic progression for the story, but we didn't know how the actual event would go down – would Rickie hit each target on the first try, or would it take him all night? Because of this, we had to make the storyline flexible. VDMX was the perfect solution for this: it allowed us to set up scenes (presets) and trigger them with OSC signals. We used TouchOSC to build our control surfaces. The controls were well labeled and large enough that we couldn't miss. We set up two iPads, one to control the scenes and progression of the story, the other just to control the moon. Each iPad used a different page of the same TouchOSC layout, which made updating and switching control around a lot easier.


Sliders and buttons from TouchOSC mapped to clips and playback controls in VDMX

We did this project in May, before the new TouchOSC layout import tool was available in VDMX. If I did this today, I would be using that tool instead of the method I show here, but I would still be sending the control surface button and slider values out via MIDI – I’ll explain why next.

Syncing VDMX with the GrandMA2 lighting controller: To sync with the GrandMA2 lighting controller we used MIDI notes. The VDMX control surface elements sent MIDI notes to the controller for the different triggers. The notes would trigger lighting elements that had been preprogrammed to match the projections.

Controlling multiple computers in sync: To control additional computers, I set up an additional OSC output for each button on the VDMX control surface. This relayed to the next computer and triggered a control surface element in VDMX. The VDMX project files running on the two machines were nearly identical, but the OSC preferences for each had different input and output settings. This ensured a one-way relay from one computer to the next. The VDMX Comm Display plugin is invaluable for setting this up because it shows you all the OSC messages flying around.

Using Syphon and MadMapper: Each layer's output was sent out via Syphon with the VDMX Syphon plugin. For each layer, I opened a separate instance of MadMapper and used the Syphon input. Greg and Grant showed me a helpful trick with MadMapper – you can install (and run) multiple instances of it; just install each instance in its own directory.

Mapping to the surfaces: Because Lumenal Code is based out of Portland, and the project was in DC, I had to do all the compositing in VDMX and MadMapper using photographs of the space. I used a photo of the room as a backdrop in MadMapper, using the technique shown here. Once we got in the space with the projectors on, one person stood next to the projection surface and we used a radio to communicate the adjustments to the mapping.

Part 5: The Big Day

We rehearsed with several other golfers in the week leading up to the actual event. None of them were able to get the ball in the targets, and we were starting to get a little worried that we had made the challenges too hard. Rickie showed up just before sunset with no idea what was in the warehouse. He did a short interview in the parking lot, then came into a dark room with a foggy, dry-ice haze covering the floor. We had a crew of about twelve people on radio headsets, coordinating different elements of the experience, all hidden away in the darkness.

Fortunately, we made the challenges just difficult enough that Rickie was able to hit the targets. It took him a few tries to hit each one, but after trying a couple of clubs and dialing in his shots, he was eventually able to nail each one and complete the story.

More photos and video at http://www.augmentedart.com/

How to make a customized Livid OhmRGB Slim template by eatyourwork


Download the idropt VDMX template project for the Livid OhmRGB Slim.

For today's guest post we're joined by eatyourwork, who first introduced us to the possibilities of using the OhmRGB Slim alongside VDMX in a blog post a couple of years ago.

Since then we've made a few basic templates to get new video performers started with a simple VJ video mixer setup for the Ohm, but in this video tutorial Simas shows off the extent to which you can customize your layout and MIDI mapping when making your own video performance rig.

Along with using pads for triggering video clips, this template makes use of the echo mode for a MIDI talkback setup, lighting up the Ohm's LEDs in different colors to reflect the state of the FX applied to each layer.

After trying out this template for yourself, check out some more tutorials to learn how to customize the setup even further. 


Notes:

Performing visuals with OhmRGB & VDMX.

Paper sketch outlining custom MIDI mappings.

Screenshot of idropt template.

OhmRGB with idropt template.

Behind The Scenes Making Of Grattacielo Pirelli Projection Mapping by Recipient.cc


For this guest tutorial we're joined by recipient.cc who give us a behind the scenes look at the techniques used in their recent projection mapping on the Pirelli Tower in Milan:

At the launch of Adidas Boost in Italy, all 9,000 square meters of the facade came to life, resulting in a show of monumental proportions. The building was shaken by vibrations of energy, bent, broken, torn, and even brought down.

 

Phase 1: Pre-production

Photographs and technical data are used to create a 3d model of the skyscraper.


Phase 2: Video Production and Sound Design

Scenes to be projected are created and prepared for live playback with VDMX, Cinema 4D, Quartz Composer, Logic, Live and other software tools.


Phase 3: Implementation

Scaffolding is built to hold the twelve 20,000 ANSI lumen projectors needed to cover the entire face of the tower.


Video Fundamentals – Part 1 – General Workflow


Since the introduction of our blog almost a year ago, the number of topics covered has grown to include such a wide variety of subjects that you could almost prepare an entire course on VJing and live video production from our collection of tutorials.

With this in mind we've started to put together a new “Introduction to Video Fundamentals” curriculum that focuses on teaching the basic knowledge needed to get started working with visuals. While these ideas are usually demonstrated with VDMX, many of the underlying techniques translate to other software and hardware used for video and lighting production.

In part two of this series (stay tuned!) we'll start to look at some more advanced topics for putting all of this together for live performances along with some of our case studies and tips from the pros in the field.

Using VDMX to demonstrate basic chroma key techniques.


Lesson 1.1 – Live Cameras, Interactive Video Generators, and Pre-Rendered Movies

Typically the original source material used in video production falls into one of three categories:

  1. Live inputs such as video cameras, published Syphon feeds, and window captures from other applications.
  2. Interactive video generators such as Flash, Quartz Composer, and CoreImage.
  3. Pre-rendered movies (and still images) from cameras or exported from animation software.

Case Study: The ECLECTIC METHOD REMIX, Part One - Making Loops

Tips & Notes:

Assignment:

  1. Download a set of sample clips  to use with the VDMX simple player template.

Challenge:

  1. Create a set of your own A/V loops using a camera or other captured footage.

Lesson 1.2 – Mixing and Applying Visual FX to Video Sources

 

Whether working in a real-time output environment like VDMX or a non-linear editor like Final Cut Pro, the technique used for digitally processing video source materials is essentially the same:

  1. Apply image filters to change the look and style of individual video sources and layer groups.
  2. Mix and combine multiple video sources by adjusting the size, crop, and composition mode.

 

Case Study: The ECLECTIC METHOD REMIX, Part Two - Jamming 

Tips & Notes: 

  • Layer masks adjust the opacity of individual pixels during composition to obscure parts of an image.
  • Preview windows in VDMX  can be set to view layers both before and after FX are applied.
  • New custom FX and composition modes can be created using Quartz Composer and other popular formats.

 

 

Assignments:

  1. Mix, and fade between sources with the example Four Channel Mixer template
  2. Practice removing a solid color background with a chromakey effect.

Challenge: 

  1. Make a pre-rendered animation that makes use of alpha channels.

Lesson 1.3 – Publishing Video

Once video sources are processed and composited together, the next step is to output the result:

  1.  Fullscreen outputs, such as projectors, displays and LED walls connected to a monitor port.
  2. Syphon outputs, to send video feeds to other applications for specialized image processing.
  3. Recorded to disk as a movie or still file that can be shared online or further remixed.

Case Study: How to do the Deadmau5 End of Year Tour visuals by Momo the Monster 

Tips & Notes: 

Assignments: 

  1. Apply an FX to only part of a layer using a mask.
  2. Create and share online a short demo reel  from what you've learned so far.

Challenge: 

  1. Make a 3x3 video wall using live sampled video clips.

Lesson 1.4 – Instruments and Automation

Pretty much every parameter in VDMX can be controlled by external hardware or automated with data generator plugins:

  1. MIDI, OSC, and DMX (ArtNet) are common protocols for creating networks and connecting physical instruments to computers.
  2.  Step Sequencers and LFOs generate control values with oscillators and looping patterns.
  3. Audio Analysis and Beat Detection algorithms listen to music from sound inputs to sync visuals.

Case Study: The ECLECTIC METHOD REMIX, Part Three - Working with Ableton Live 

Tips & Notes: 

 

Assignments:

  1. Use audio analysis (or another data-source) to trigger video clips.
  2. Create a virtual layout of a MIDI instrument with the Control Surface plugin.

Challenge: 

  1. Sync the playback of movies on two computers across a network with OSC.

Creating a multi-channel live camera video sampler


For this technique video tutorial we'll be looking at how to use VDMX to create a multi-camera video sampler setup with the ability to record movie clips from a live feed to be immediately remixed and saved for later editing. As movie clips are sampled they are automatically added to the bin page where they can be triggered for output, making this simple example useful either on its own for video production or as part of a larger VJ-style live performance rig.

To begin building this setup we start by adding one of our camera inputs to the Media Bin, followed by setting up Preview Window and Movie Recorder plugins to handle view / capture for the live feed.

Once the single stream version is ready we can go about repeating this process for each of the other inputs as needed along with additional layers if multiple streams are to be output at the same time.

After completing this tutorial try using Audio Analysis or Face Detection to trigger the sampled movies. 

Tips and Notes:

Adjusting camera and digitizer settings in the “VidIn” tab of the Workspace Inspector.

Customize the layout of the preview, recorder, and media bin for each input.

Right-click on clips and choose "Reveal Selected Files In Finder" to locate recordings.

Using a mask to apply an FX to only part of a layer


In this guest tutorial we're joined by the Rockwell Group's LAB division who work as an interactive design team within the larger architecture firm where they focus on projects that blend physical and virtual spaces.

For a recent projection mapping installation in NYC, one of the techniques used by the LAB was to apply a real-time video effect to a specific portion of one of the pre-rendered movies, such that part of the image was left unprocessed in the main output while another section was color shifted to match the lighting effects in the room.

To begin, two movies are loaded into VDMX – the first is the original media of the complete statue and the second is a black and white movie containing just the robes. Next, the Layer Mask FX is used with the black and white stream as its source to describe where the hue shift should be applied.
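
To illustrate the general idea in code, the sketch below shows roughly how a mask-driven effect can work. This is a hypothetical ISF-style GLSL fragment shader, not the actual VDMX Layer Mask FX: the "shiftAmount" input and the stand-in color-shift math are placeholders, while IMG_THIS_PIXEL and IMG_THIS_NORM_PIXEL are the ISF functions for sampling image inputs.

    /*{
        "CATEGORIES": [ "Color Effect" ],
        "INPUTS": [
            { "NAME": "inputImage", "TYPE": "image" },
            { "NAME": "maskImage", "TYPE": "image" },
            { "NAME": "shiftAmount", "TYPE": "float", "DEFAULT": 0.5, "MIN": 0.0, "MAX": 1.0 }
        ]
    }*/

    void main() {
        vec4 src = IMG_THIS_PIXEL(inputImage);
        // Sample the mask with normalized coordinates so its resolution can differ from the source
        vec4 mask = IMG_THIS_NORM_PIXEL(maskImage);
        // Stand-in "processed" version of the pixel: blend toward rotated color channels
        vec4 shifted = vec4(mix(src.rgb, src.gbr, shiftAmount), src.a);
        // Bright areas of the mask get the processed pixel, dark areas stay untouched
        float amount = dot(mask.rgb, vec3(0.299, 0.587, 0.114));
        gl_FragColor = mix(src, shifted, amount);
    }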

Another handy trick in this setup is that the original content is sideways – this is because the actual projector has been rotated, set on its side to get additional vertical resolution, and uses perspective correction to compensate for the projection angle.


Tips and Notes:

Preview windows showing each layer and how they are combined for the final output.

Sync the playhead of the masking layer source to the movie layer time.

Set the background to receive the movie layer output before any FX are applied.

Video fundamentals in VDMX – Part 2 – Media Types


In the first part of this series we opened with a general overview of the start-to-finish workflow typical of VJing and live video production, beginning with the types of source materials, which then get processed by visual FX, composited with other images, and finally output to a projector or rendered to a file to share online.

For part two of this series we'll be looking more in depth at the four main types of video sources that you'll encounter.

Window Inputs in VDMX can capture and mix the output from other applications.


Movies, still images and other pre-rendered media types are stored in files on a disk. Inside these files the image data is often compressed into a format known as a codec that can be read back by a computer. Once a movie is created, the pixel dimensions of each of its video frames (called the movie resolution) become fixed.

Whether you are using VDMX or another VJ app or media server, you can expect to find standard playback options such as setting the rate, loop-mode, in / out points, and volume level of each movie file.

In most cases movie files can be loaded into your media bin or playlist window by dragging them from the Finder or by using a built-in media loading window.

Case Study:  How to Turn an Old Building into an Interactive Driving Range by Gabe Shaughnessy & Dan Cohen.

Tips & Notes:

  • Standard movie playback controls include settings such as rate, current time, and volume adjustment. 
  • For the best quality, use your original uncompressed source movie files when exporting to a new format such as PhotoJPEG or Hap.
  • The Hap Alpha video codec can be used to include a transparency layer (alpha channel) in a movie file.
  • Share and download royalty free video clips with other artists on websites such as videopong and archive.org
  • The most common resolution standards used for displaying video are 640x480 (Standard Definition), 1280x720 (HD 720p), and 1920x1080 (HD 1080p) so typically it is a good idea to create movie files at these sizes.

Assignment: Watch working with panoramic footage shot with the Kogeto dot

Challenge: Make your own custom movie player configuration


Live inputs are cameras and other video feeds that are captured using special hardware connected to your computer. These can include web-cameras (built-in or connected over USB/FireWire), DV over FireWire, as well as higher end Thunderbolt and PCI based HDMI / SDI / HD-SDI capture devices. In most cases VDMX can work with as many live inputs as the hardware you are using can support.

Within VDMX the setup for video inputs takes place in the Vid In section of the Workspace Inspector window. Inputs can also be directly accessed from the Layer Source picker or by adding them to pages as clips to be triggered like any other media type.

In this demonstration we'll quickly set up VDMX as a switcher between a built-in web camera and an HD camera connected by a Blackmagic UltraStudio MiniRecorder.

Case Study: Creating Video Feedback Loops.

Tips & Notes:

  • Use the Preview Window plugin to watch live camera feeds before they are visible in the main output.
  • The Movie Recorder plugin can be used to capture samples from a camera feed for live remixing or later editing.
  • Similar to movie files, the most common resolutions for video capture hardware are 640x480 (Standard Definition), 1280x720 (HD 720p), and 1920x1080 (HD 1080p), but you may encounter others with webcams.

Assignment: Making a multi-camera live video sampler.


Interactive generators such as Quartz Composer, CoreImage, FreeFrameGL, and Flash are media file types which contain instructions for creating video streams.

Many of these formats allow for custom input parameters that can be adjusted during performance and immediately taken into effect by the generator patch. This differs from using regular movie files, which use a dedicated set of controls for playback.

In this example we'll be using Quartz Composer which is a developer tool from Apple that can be used on its own, as well as a variety of different ways within host applications.

Case Study: The Quartz Composer Valentines Day Example.

Tips & Notes:

Assignment: Minuek's introduction to making Quartz Composer sources.

Challenge: Make a custom text generator composition for VDMX.


Syphon is a standard protocol that lets video environments on a Mac, such as VDMX, Jitter, and Processing, efficiently share image streams between applications. During a performance, using Syphon sources is similar in many ways to using live inputs, only instead of camera feeds the video frames are provided by other software. It can also be useful to send the output from VDMX to other tools for specialized mapping or display.

To receive Syphon feeds within VDMX, use the pop-up menu in the Layer Source Controls panel, or add them to the media bin pages as clips that can be triggered like any other media type.

Within VDMX any number of layers and groups can be sent to other applications with the Syphon Output plugin.

For this example we'll demonstrate both receiving (clients) and publishing (servers) from VDMX alongside the two basic sample Syphon applications.

Case Study: Connecting Unity to VDMX by Syphon.

Tips & Notes: 

  • Read more about Syphon and other supported applications at http://syphon.v002.info/
  • To send the 'Main Output / Canvas' from VDMX to other applications by Syphon, disable the 'Skip Canvas Rendering' option in the VDMX preferences.
  • Syphon feeds can include alpha channels for layer transparency.
  • Use the 'Window Capture' option for applications that do not support Syphon natively.

Assignment: Use the “Simple Server” example Syphon test application with VDMX.

Challenge: Using the VDMX Window Capture with WebGL in Google Chrome.

Installing and using FreeFrame Generators and FX


Among the real-time video generator and FX formats is an open source plugin type called FreeFrame, along with the newer GPU-based FFGL format, which uses the graphics card for faster image processing. In this tutorial we'll look at how to install these 3rd party FreeFrame plugins to use with VDMX.

To begin, find some FreeFrame plugins to try out. One place to start is the searchable FreeFrame database.

Once you've downloaded some FreeFrame or FFGL plugins to your Mac, to use them within VDMX simply place them in the "/Library/Graphics/Free Frame Plug-Ins/" directory on your Mac.

FreeFrame and FFGL generators can be found in the "Use Source" menu in the Layer Source window, or added to a media bin page from the Import Media window.

To apply a FreeFrame based FX to a layer, select it from its sub-category in the "Load Asset" pop-up menu in the Layer FX window.


Notes:

FreeFrame generators can be loaded from the Layer Source menu or added to pages with the Media Browser.

Apply FreeFrame and FFGL FX to a layer from their sub-category in the Layer FX "Load Asset" pop-up menu.

Overview of Available Data-sources in VDMX


A big part of using VDMX is taking advantage of the ability to automate any of the standard interface controls (sliders, buttons, color pickers, pop-up menus), which are used to control everything from the opacity of a layer, to the volume or rate of a movie, to the intensity of a blur FX being applied, or even the settings of the automation plugins themselves.

For this quick reference tutorial we'll look at all of the available data-sources that VDMX publishes internally for automating controls. These can all be used on their own, or together, and in some cases you may have multiple providers of each type.

When finished, read about hardware interfaces such as WiiMotes and MIDI / OSC / DMX based instruments.

Using the UI Inspector to set the data-source for the "Layer Opacity" slider to the "Mouse" position.


Data-Source Plugins:

These plugins are designed specifically for the task of generating control data to automate the behaviors of interface items such as sliders, buttons, and pop-up menus within VDMX. Each of these plugins can be customized in a variety of ways and you can have as many instances of each as needed running at the same time.

Audio Analysis, Clock, Control Surface, LFO and Step Sequencer can be added from the Workspace Inspector under the Plugins tab.

  1. Audio Analysis: Converts a sound input from a microphone or line input into control data.
  2. Clock: Generates time based sync signals for a given BPM or duration, including automatic beat-sync with music.
  3. Control Surface: Make custom layouts of interface items including sliders, buttons, and color pickers.
  4. LFO: A simple function generator for publishing multiple sine, cosine, random and similar math waves in sync.
  5. Step Sequencer: Create patterns and sequences of control values that playback over time.

Layer Data-Sources:

For each layer in your setup these data-sources will be available and update to match the settings of the layer itself.

The Movie Time and Opacity values are published for each layer in your setup to sync with other parameters.

  1. Layer Opacity: The transparency value of each layer set by the 'Opacity' slider in the Layer Composition window.
  2. Movie Time: The current playhead position of a movie playing on each layer, available as normalized (0.0 to 1.0) and non-normalized (0 to movie duration in seconds).

Other Useful Data-Sources:

Along with their normal capabilities, several standard plugins provide data-sources for keeping other parts of your setup in sync with their current state.

  1. Media Bin: When clips are triggered, the 'Index' of the clip on the current page is published.
  2. Mouse Position: Always available, the normalized x/y position of the cursor on screen.
  3. Preview Window: The position of any mouse clicks and drags on each Preview Window are published.
  4. Two Channel Mixer: In addition to directly controlling the opacity of its target layers, the 'Crossfade Position' can be used as a data-source.

Quartz Composer based Data-Sources:

VDMX also allows for creating new data-sources using Quartz Composer compositions loaded as custom plugins, using published outputs from the Input Splitter object. For more information see the page on Making Your Own Data-Providers, or see the Face Tracking FX Tutorial for an example of this in action.

Creating and Installing ISF Generators


Download the example project and ISF generators for this tutorial.

Also in this series see the introduction to making your first ISF video FX.

One of the most powerful capabilities of modern GPUs is the ability to run specialized code known as GLSL shaders which can be used to create computer generated video streams whose control parameters can be manipulated in real-time.

An ISF, or “Interactive Shader Format” file is a GLSL fragment shader (.fs) that includes a small blob of information that describes any input controls that the host application (such as slider, button, and color picker controls in VDMX) should provide for the user when the generator is loaded for use, as well as other meta-data including the authorship, category and a description.

In this two part tutorial we'll cover the basics of using ISF generators within VDMX as sources for layers and how to install new example ISF files you may download from the Internet, followed by a quick introduction to creating your own GLSL fragment shaders.

A layer in VDMX rendering a basic sine wave ISF Generator.


Loading ISF Generators into VDMX

Loading ISF Generators into VDMX is essentially the same as when using other disk based files such as movies or Quartz Composer compositions. To add an ISF file to your VDMX project, simply drag the .fs file from the Finder or Media Browser window onto a page in a Media Bin plugin. To trigger the file to a layer, click on its thumbnail.

Like other built-in sources, any ISF generators that are installed system-wide can be directly accessed using the “Source Picker” menu in the Layer Source window, or accessed from the Media Browser window, under the ISF Sources category.

If you've downloaded ISF Generators or FX from a website such as the VDMX forums, you can install them to use as built-in sources by copying them to the: “/Library/Graphics/ISF” directory.


Making ISF Video Generators

For the second half of this tutorial let's now take a look at how the 'sine fill' example ISF file was created and ways we can modify it to have additional parameters. To begin, open the file in your text editor of choice – if you aren't sure which to use, TextEdit, TextMate, BBEdit, and SublimeText are great options.

Each ISF file is broken up into two sections: At the top, a commented out section written in a notation called JSON that describes the file. Below that is our GLSL code that does the actual drawing.

Viewing the file in a text editor you can see we've set up the math for the drawing to set each pixel below the curve to solid white (1,1,1,1), and any pixel above the curve to transparent black (0,0,0,0). In the top section, the inputs for the number of cycles and the offset are declared as published variables that will show up as interface controls in VDMX.
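
To make that concrete, here is a minimal sketch of a sine-fill style generator. This is not the exact file shipped with VDMX – the input names "cycles" and "offset" and their defaults are illustrative – but RENDERSIZE is the standard uniform an ISF host provides with the output dimensions in pixels.

    /*{
        "DESCRIPTION": "Fills the area below a sine wave with solid white",
        "CREDIT": "illustrative sketch",
        "CATEGORIES": [ "Generator" ],
        "INPUTS": [
            { "NAME": "cycles", "TYPE": "float", "DEFAULT": 2.0, "MIN": 0.0, "MAX": 10.0 },
            { "NAME": "offset", "TYPE": "float", "DEFAULT": 0.0, "MIN": 0.0, "MAX": 1.0 }
        ]
    }*/

    void main() {
        // Normalized (0.0-1.0) coordinates of the pixel being drawn
        vec2 pt = gl_FragCoord.xy / RENDERSIZE.xy;
        // Height of the sine curve at this horizontal position, remapped to 0.0-1.0
        float curve = 0.5 + 0.5 * sin(2.0 * 3.141592 * (pt.x * cycles + offset));
        // Solid white (1,1,1,1) below the curve, transparent black (0,0,0,0) above
        gl_FragColor = (pt.y < curve) ? vec4(1.0) : vec4(0.0);
    }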

Tip: You can modify existing ISF files to change their behavior or publish new variables for input controls.


Notes:

Built-in ISF sources are located in the “/Library/Graphics/ISF” directory.

The ISF Specification page includes detailed information and tips.

Download the ISF test filters collection as a development reference.

Creating and Installing ISF FX


Download the example project and ISF FX for this tutorial and the collection of reference filters.

Also in this series visit the introduction tutorial on making your first ISF video generator.

One of the most powerful capabilities of modern GPUs is the ability to run specialized code known as GLSL shaders which can be used to create real-time video filters that can be applied to movies, live cameras, and other computer generated feeds.

An ISF, or “Interactive Shader Format” file is a GLSL fragment shader (.fs) that includes a small blob of information that describes any input controls that the host application (such as slider, button, and color picker controls in VDMX) should provide for the user when the FX is loaded for use, as well as other meta-data including the authorship, category and a description.

Example ISF “Color Multiply” FX applied to a VDMX layer.

In this two part tutorial we'll cover the basics of applying ISF based FX to layers in VDMX and how to install new example ISF files you may download from the Internet, followed by a quick introduction to creating your own image processing GLSL fragment shaders.

Tip: You can modify existing ISF files to change their behavior or publish new variables for input controls.


Applying ISF FX to Layers in VDMX

Using ISF based FX in VDMX is essentially the same as using other formats such as CoreImage, Quartz Composer and FreeFrame. Like other FX types each individual filter will have special input controls that are specific to the filter that show up as interface items such as sliders, buttons and color pickers.

If you've downloaded ISF FX from a website such as the VDMX forums, you can install them by copying them to the “/Library/Graphics/ISF” directory, or to the ISF folder in the VDMX application support directory.

Once installed the FX will show up under the appropriate category in the Layer FX window under the “Load Asset” pop-up menu.

Tip: Many standard FX are available in multiple formats; the ones that are ISF can be identified by the .fs extension at the end of their names.


Making ISF Video FX

For the second half of this tutorial let's now take a look at how the ‘Color Multiply’ example ISF FX was created and ways we can modify it to have additional parameters. To begin, open the file in your text editor of choice – if you aren't sure which to use, TextEdit, TextMate, BBEdit, and SublimeText are great options.

Each ISF file is broken up into two sections: At the top, a commented out section written in a notation called JSON that describes the file, including its FX category and the "inputImage" variable that will deliver the incoming video frame to be processed. Below that is our GLSL code that handles the processing for each pixel in the input image.

Viewing the file in a text editor you can see we've set up the math to simply multiply the RGBA color value of each pixel by the input color. In the top section, the inputs for the image and the color are declared as published variables for VDMX to access and provide interface controls for.
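
As a rough sketch of what such a file can look like (the metadata here is illustrative – the published color input is named "multColor" for this example – and IMG_THIS_PIXEL is the ISF function for sampling an image input at the current pixel):

    /*{
        "DESCRIPTION": "Multiplies each pixel of the input image by a color",
        "CATEGORIES": [ "Color Effect" ],
        "INPUTS": [
            { "NAME": "inputImage", "TYPE": "image" },
            { "NAME": "multColor", "TYPE": "color", "DEFAULT": [ 1.0, 1.0, 1.0, 1.0 ] }
        ]
    }*/

    void main() {
        // Sample the incoming video frame at the current pixel
        vec4 srcPixel = IMG_THIS_PIXEL(inputImage);
        // Multiply its RGBA value by the published color input
        gl_FragColor = srcPixel * multColor;
    }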

More details about making FX and examples can be found on the ISF Specification page.


Notes:

Place your ISF FX in the “/Library/Graphics/ISF” folder.

Use the “Load Asset” pop-up menu to apply ISF FX to a layer in VDMX.

Use a published "image" input named "inputImage" to make an FX.

Use a published "image" input named "inputImage" to make an FX.


Introduction tutorials translated to French Part 2


Thanks to Jehan Fillat for these translations!

Also read the first round of French translated tutorials.


Layer composition, covering the opacity and blend modes

The source files for this tutorial can be downloaded here.

Step 1: In the Workspace Inspector, create at least two layers. Select the first layer to inspect it and modify its composition controls.

Tip: To open the composition controls for the layer in its own window, click the 'Expand' button.

Step 2: Adjust the composition mode and opacity settings to control how the layer is rendered over the layer below it.

Tip: OpenGL blend modes are faster than CoreImage and Quartz Composer composition modes. Use them when possible.

Changing the blend mode of the top layer.

Step 3: Click the 'Size & Pos' and 'Rect' tabs, then change the dimensions of the layer using the 'Width', 'Height', X, and Y sliders.

Step 4: Click the 'Crop' tab to change the number of pixels cropped from the top, bottom, left, and right of the source image using the available sliders.

A three-layer setup using blend modes along with the opacity, size, position, and crop controls.


Applying a mask to a layer

Prerequisites:

  1. Read the tutorial on layer composition, covering the opacity and blend modes
  2. Read the tutorial on adding an FX to a layer

The source files for this tutorial can be downloaded here.

Step 1: In the Workspace Inspector, create an additional layer, inspect it, and hide it by setting its opacity to 0 or by clicking the 'Hide' button.

Tip: In the Layer Source controls, choose a built-in source such as the CoreImage Checkerboard or Quartz Composer Gradient to create a test mask image.

Notez le bouton d’état Show/Hide (Afficher/Cacher) dans les contrôles de 
composition du calque pour ce calque caché ‘Mask Layer’ (Calque de Masque)

Notez le bouton d’état Show/Hide (Afficher/Cacher) dans les contrôles de composition du calque pour ce calque caché ‘Mask Layer’ (Calque de Masque)

Étape 2: Ajoutez l’effet « Layer Mask » (Masque du calque) au calque qui sera dissimulé (le calque supérieur), et dans le menu pop-up « maskImage », sélectionnez le flux vidéo du calque caché.

Tip: The Layer Mask FX is located in the 'Color Effect' category.

Tip: Set the composition mode of the foreground layer to 'OpenGL Over' or 'Source-Atop-W' for best results.

Previews of the foreground and masking layers at the different stages of processing, and the final render.

Applying Math Expressions to Slider Receivers

$
0
0

Related: Syncing a Slider to a Data-Source and Introduction to MIDI/OSC/DMX Setup

Using control data from other applications, external hardware, and internal providers like the LFO or Audio Analysis plugins is a major component of VDMX: every UI element (e.g. sliders and buttons) is capable of being controlled by MIDI/OSC/DMX and other data sources. The procedure for the setup is consistent across all interface control items: you add a receiver (which listens for incoming data) to the UI element you're working with using the UI Inspector. Once the assignment is made, the sub-inspector for data-receivers contains several options for how the data is mapped onto the UI item being controlled.

Typically the range of numbers being received can be automatically translated by VDMX to cover the local minimum and maximum envelopes of a slider UI item. However, in some situations you may want to override the default number mapping behavior by using the settings in the sub-inspector panel for receivers. In this tutorial we'll look at some of the common cases you may run into and how to handle them when making your own VJ rig or video instruments.


Notes and Tips:

Decimal receivers take an incoming numeric value, optionally modify it via a series of math functions, and apply it to whatever owns the receiver:

  • "Invert val?" inverts the received value.
  • "Scale val to fit within envelopes" takes the received value and scales it to fit within the min/max envelopes of whatever owns the receiver (usually a slider). This lets you use the full range of a hardware control to manipulate a specific range of a slider.
  • When the "Skip duplicate received values" option is enabled the UI item will not output if the incoming values have not changed.
  • The "Use Num FX Chain?" toggle determines whether or not this receiver uses a number fx chain to modify the received data before passing it on to the object which owns the receiver.
  • The "Edit Num FX Chain" button uninspects the receiver and its attached UI item and inspects the receiver's num fx chain, allowing you to edit and add to it.
  • The middle of the Decimal Data Receiver Inspector changes depending on whether the receiver is receiving data from a MIDI, OSC, or DMX data source.
    • When receiving from MIDI, there's an option for "Endless MIDI mode": some MIDI devices send increment/decrement messages when their infinite rotary knobs are turned, and this mode interprets those messages correctly.
    • When receiving from OSC/data sources, there's a toggle for choosing whether or not the receiver's values are "normalized". By default, all receivers are expected to be normalized (values ranged from 0.0 to 1.0); if you disable this, the receiver will pass values outside this range through to whatever owns it. This is particularly useful if you want to set the exact value of a slider directly.
    • When receiving from OSC/data sources, a text field becomes visible which lets you choose which item in a received list the receiver should use. Some OSC devices and software encode values in lists; this lets you extract the value you need from them.
  • The "Do Math?" option reveals a text field which can be used to enter a custom mathematical expression that is applied to the receiver before output.
    • Use the $VAL, $MIN and $MAX strings to access the slider value, minimum and maximum envelope values as variables in your equations.
    • For a full list of supported mathematical functions and constants visit the wiki for DDMathParser by Dave Delong.
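
A few illustrative expressions (these particular formulas are examples written for this article, not presets from VDMX, and assume DDMathParser's standard floor() function is available):

    ($MAX + $MIN) - $VAL      invert the value within its envelope range
    floor($VAL * 8) / 8       quantize a normalized value into 8 discrete steps
    $VAL * $VAL               ease-in curve that favors the low end of the range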

Click on a slider to inspect it in the UI Inspector. Use the '+' button to create a new receiver.

Receivers can adjust the slider value, or the positions of its min / max envelope handles.

The "Do Math" option in the receiver sub-inspector allows for entering math expressions.

The "Do Math" option in the receiver sub-inspector allows for entering math expressions.

Setting A Movie to Resume Last Playback Position

Download the completed project file and sample clips for this tutorial.

When preparing in advance to perform visuals using movie files, one of the more subtle controls you may want to customize on a per-clip basis is the start point of individual clips when they are triggered. While the default behavior is to play back from the first frame of the video, it may be necessary to have a clip resume playing from the last time it was used. This is accomplished in VDMX by using the Files section of the Workspace Inspector, where you can specify custom playback behaviors for individual clips.

To begin, on the left side select the page that contains the clip you want to inspect, then choose it from the list on the right. 

In the sub-inspector we're looking for the "Movie Start Time" attribute. When this is enabled the clip will resume playback from its last play time when it is next triggered.


Notes:

Multiple clips can be inspected simultaneously in the File Inspector.

Enable the "Movie Start Time" attribute to resume movie playback.

Enable the "Movie Start Time" attribute to resume movie playback.

Files can also be inspected by right+clicking in the media bin.

Connecting the x-OSC wireless I/O board to VDMX

The ‘Compendium’ video object.

For this guest tutorial we are joined by Will Reardon, a motion designer and artist currently developing video-based art objects. For his next project, Will is using VDMX and the x-OSC I/O board to create a device similar to his previous ‘Compendium’ object, but with added interactive functionality for controlling its animations.

We'll begin by making a basic test connection between the software and hardware over WiFi, then start adding a series of sliders and knobs to VDMX that will receive values from the I/O board's analog inputs.

The finished controller object will have knobs, switches, and sliders connected to VDMX via x-OSC, controlling various aspects of an on-screen animation, all housed in a custom box.
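
Before wiring up the physical pot, it can help to confirm that VDMX is receiving OSC at all. Below is a minimal Python sketch using the third-party python-osc package that sends a slowly sweeping value you can assign to a slider via the UI Inspector. Port 8000 matches the input described in the notes below; the "/test/pot" address is a placeholder for this demo, not x-OSC's actual message format.

    # pip install python-osc
    # Sends a sweeping 0.0-1.0 value to VDMX on port 8000, mimicking a pot.
    import math, time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 8000)  # VDMX OSC input port

    while True:
        value = (math.sin(time.time()) + 1.0) / 2.0   # smooth 0.0-1.0 sweep
        client.send_message("/test/pot", value)        # placeholder address
        time.sleep(0.05)                               # ~20 messages per second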

Notes and tips:

Hardware used for this tutorial: an old potentiometer (pot) connected to x-OSC with 3 jumper wires and 3 crocodile clips.

Switches, sliders and pots generally have 3 connectors on the back. The middle one is typically the variable output. The other two are power and ground.

x-OSC connects wirelessly to the computer over a router or an ad-hoc computer WiFi network. The device can be powered by USB or a battery pack.

In the VDMX OSC preferences add an input to receive messages on port 8000.

x-OSC also features on-board gyroscope, accelerometer, and magnetometer sensors. Device settings can be configured from any web browser.

Each input from your custom controller can be connected to any slider, button, or other interface item in VDMX using the Interface Inspector.

Once again, special thanks to Will Reardon for preparing this lesson, and be sure to check out the x-OSC website for more information and additional tutorials on using it with other software.

Using a Step Sequencer to trigger media clips

Download the completed project file and sample media for this tutorial.

Note that the technique used for this example will also work using a different VDMX data-source generator, such as an LFO, or an external software / hardware device sending MIDI, OSC, or DMX values to the media bin.

Along with controlling standard UI elements like sliders, buttons, and color pickers, tracks in the Step Sequencer plugin in VDMX can be used to automate switching the media files playing back on a layer, creating visual rhythms. This general technique is useful for many projects, such as VJ performances and interactive video installations.

For this video tutorial we'll start with a simple project that contains a media bin and step sequencer plugin.

The first step is to change the published value range for the step sequencer track from its default 0.0 to 1.0 (aka 'normalized') output to integer (aka 'index') output. Once this change is made, we can use it as the data-source for the 'Trigger by Index' receiver in the Media Bin plugin.
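
As the note above mentions, external software can drive the same 'Trigger by Index' receiver. Here is a minimal Python sketch (python-osc again) that steps through clip indices once per beat at 120 BPM; the "/vdmx/trigger" address and port 8000 are placeholders you would match to whatever OSC input you've configured in VDMX's preferences.

    # pip install python-osc
    # Cycles through clip indices 0-3; assign the incoming address to the
    # media bin's 'Trigger by Index' receiver to switch clips on the beat.
    import itertools, time
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 8000)   # placeholder port
    beat = 60.0 / 120.0                           # seconds per beat at 120 BPM

    for index in itertools.cycle([0, 1, 2, 3]):
        client.send_message("/vdmx/trigger", index)   # un-normalized integer index
        time.sleep(beat)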

After you've finished this tutorial try using this technique along with automatic Waveclock BPM detection or learn how to add a color track to the step sequencer.


Notes:

Disable 'Normalize values' for the sequencer track.

Use the 'Trigger by Index' row in the Media Bin options.
