Channel: Tutorials - VDMX - MAC VJ SOFTWARE

Making a customized version of 'Grid Pro' in VDMX


Download the completed Grid Pro template for VDMX along with some sample clips in Hap 720p.

Previously in this series: recreating the classic VIDVOX Grid using VDMX.

In this video walkthrough we'll show how to turn the basic triggering and scrubbing app Grid into its big brother Grid Pro, which added a number of powerful features such as transitions for fading between clips, image FX processing, audio analysis, a text generator layer, and direct-to-disk movie recording.

For people already familiar with using Grid Pro for video DJing and mixing, the completed template is a good starting point for jumping into VDMX as a pre-built VJ instrument with your existing movie clips.

Next, look at the tutorials to customize this layout to work with MIDI and OSC controllers, or extend it further into a four channel mixer template.

Tips for using this template:


Adding visual FX to the Right layer. Separate FX chains can be applied to the Left, Right and final Mix.


The 'Left' layer renders on top of the 'Right' layer. Change the 'Left' composition mode to adjust the type of blending used to fade between clips.


The pop-up menu in the 'Preview' window plugin can be used to change the viewed layer or mix group. Use the UI inspector to assign keyboard shortcuts.

Tips for building this template from scratch:


Set the voice mode of the 'Media Bin' to 'Cycle' and use the 'Auto-Fade' option in the Two Channel mixer for transitions.


Create a layer with a text source to recreate the Grid Pro Fontsynth for overlays.


Audio Analysis and LFO controllers from Grid Pro are available as plugins in VDMX.


Screenshot of the original Grid Pro video instrument.


Completed Grid Pro template for VDMX.


Intro to Making Audio / Video Loops for DJs and VJs


Download the audio tracks, video footage, and completed A/V loop for this tutorial.

In addition to our regular tutorials on how to use VDMX for real-time video mixing, one of our goals on the VIDVOX blog is to help people get better at using some of the other equipment that video artists, particularly VJs, come into contact with in the process of creating their own original live performances.

For this A/V performance technique lesson we'll be taking a look at some of the beginner tools that musicians and DJs use to make music, followed by a quick intro to creating video loops from scratch. To top it off, the two separate tracks will be combined into a single A/V movie file for use in VDMX or other VJ software for live remixing.

Often the biggest stumbling block with getting into a new kind of art is figuring out where to begin and overcoming the initial phase of not being very good at it. With that in mind, whether it is sound or image that is new to you, remember the goal here is just to have some fun becoming familiar with a new kind of media.

P.S. Releasing a set of Creative Commons A/V clips gets you on the way to the 'starving artists discount' for VDMX.

If you're a musician or DJ with an existing library of sound files, you can probably skip to the 2nd step on shooting and capturing video. Likewise, video artists and VJs who have banks of prepared videos can spend more time focusing on step 1 dealing with creating sound files to sync up with their movies in step 3.

After completing this tutorial, move on to watching the guest visit from The Eclectic Method explaining his process for creating remixes using A/V samples triggered by Ableton Live and VDMX.

"Dual deck" remixing a single A/V loop on two layers in VDMX with a simple mixer.

1. An Introduction To Making Audio Loops:

For people just beginning with making music, there are a lot of great apps available on the Mac and iOS that can be used as a starting point for creating a wide variety of sounds in different styles and genres depending on what you're looking to make. Many of these programs come with their own sound libraries and generators, and others will let you capture and process sounds sampled from a microphone. While you may eventually find yourself moving on to more pro software like Traktor or Ableton Live in the future, to begin with we recommend trying some of the simpler audio loop making apps.

Here are two quick lessons in using iMPC and Figure on the iPad to jam out original loops and render them to fixed sound files that can be copied over to a Mac to be later synced up with a video. We'd also recommend checking out GarageBand and exploring the App Store for other unique instruments and sounds.

Thanks to Kevin Luddy for demoing these apps!

2. An Introduction To Making Video Loops:

One of the top questions we get from new VJs, whether they are coming from a musical background or otherwise, is where to get original material for mixing and projecting during shows.

For most people these days, the quickest way to gather starter material is using the camera from a smart phone or tablet - you can usually find something to shoot, whether it is an interesting texture, a reflection of light on a window, or your friends having fun. Try playing with different kinds of shots such as close up vs far away, lots of motion vs slow moving, or blurry vs in focus. Don't forget you can listen to the audio loop a few times beforehand to get a sense of the timing and flow.

To make our basic video loops to sync up with our prepared audio tracks from step one, we've simply recorded a few seconds of raw footage of spring flowers on an iPhone camera (below, right) and copied them over to a Mac to be further edited in step 3.

Tip: If you don't have access to a camera, or are more interested in digitally rendered video, Apple Motion is another great place to begin for creating your own visuals from scratch.


Use the free 'Image Capture' utility, iPhoto, or DropBox app to copy movies and images from the iPhone or iPad camera.

3. Using iMovie to Create A/V Loops From Audio and Video Samples

For the final step in this tutorial we'll be using the simple video editor iMovie to load our prepared tracks and combine them into A/V movie loops that can be used for live performances with VDMX.

If you don't have your own files, download some example audio tracks and raw video footage.

To begin, first import the video file you've created (or downloaded) in step 2 into a new iMovie project. Scan through the raw video footage until you find a section of equal length with the right pacing and drag it into the timeline to create a new project.

Next add the audio loop as a soundtrack using the ♫ view from the toolbar. Double click on the clip to open the inspector to fine-tune the color levels, apply automatic image stabilization, and other video effects to the file before exporting.

Notes:

When using the 'Export using QuickTime' option in iMovie remember to use the PhotoJPEG, Apple Intermediate or Hap video codec to get the best playback in VDMX.

Add audio loops to your iTunes library to access them from within the iMovie timeline.

Read up more on topics like movie playback and audio analysis in VDMX.

Guest Tutorial: 4 ways to sync VDMX and Ableton Live with Mattijs Kneppers and Studio Rewind


Download the completed Ableton Live set and VDMX project for this tutorial and the Livegrabber plugins.

When using VDMX to create visuals or VJ alongside other software tools it is possible to send control information back and forth by using open standards such as MIDI and OSC. This is particularly useful when using other software to make music, or when collaborating on live performances with musicians for keeping parts of the audio and video tracks in sync.

One of the most popular tools for creating music today is Ableton Live, which in addition to natively supporting MIDI in and out for syncing with hardware controllers and other software, recently added a powerful new way for users to customize their projects with custom plugins written with Max/MSP.

Over the last few years artist / programmer Mattijs Kneppers has been developing “Livegrabber,” a set of plugins that can be used to easily sync Ableton Live with other applications by echoing out actions in Live over the OSC protocol. For the release of the newest version of the plugins, we're joined by Mattijs and Auke from Studio Rewind to show off four different ways to use the Livegrabber plugins with VDMX.
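If you'd like to verify what Livegrabber is sending before wiring it into a project, a tiny OSC monitor can print every incoming message. Here is a minimal sketch using the third-party python-osc package; the port 1234 is only an example and should match whatever port you point Livegrabber at (run it on a spare port, or before launching VDMX):

    # Minimal OSC monitor: print every address and argument list that arrives
    # on the chosen UDP port. The port number is an example value.
    from pythonosc import dispatcher, osc_server

    def print_message(address, *args):
        print(address, args)

    disp = dispatcher.Dispatcher()
    disp.set_default_handler(print_message)

    server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 1234), disp)
    server.serve_forever()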

For more information about the plugins visit the Livegrabber website: http://showsync.info/livegrabber

Also check out more tutorials on using Ableton Live with VDMX or the basics of MIDI / OSC setup.

Notes:


In the VDMX Preferences under 'OSC' set up the Input Port to receive messages from Livegrabber.


Use the 'Convert Drums to New MIDI Track' option in Live 9 to create control data from an audio track.


Applying the 'Single Note Grabber' plugin to a MIDI track in Live by dragging from the Finder.

Three Different Ways to use Audio Analysis (or other data-source) to Trigger a Movie in VDMX


In this quick tutorial we'll be looking at how to use the Audio Analysis plugin instead of the usual keyboard, MIDI, OSC, or DMX shortcuts for triggering video clips in a media bin. This can be a particularly useful technique for using VDMX to run interactive installations that respond to sound and other inputs, or for automating and beat syncing parts of a VJ setup.

When following along with this tutorial, if you don't have an audio input (or if it is inappropriate to make loud noises in your workspace) you can try using another data-source plugin such as an LFO or Step Sequencer in its place.

Once finished try using this same basic idea with the Two Channel Mixer plugin to make audio reactive transitions and cuts between layers.
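Under the hood this kind of triggering is just a threshold comparison. As a rough illustration in plain Python (not VDMX's internals, and assuming the trigger should fire only when the level rises past the threshold rather than continuously while it stays above it):

    def make_trigger(threshold=0.6):
        state = {"above": False}
        def step(level):
            # Fire once on the rising edge of the analysis level.
            fired = (level >= threshold) and not state["above"]
            state["above"] = level >= threshold
            return fired
        return step

    trigger = make_trigger(0.6)
    print([trigger(v) for v in (0.1, 0.7, 0.9, 0.4, 0.8)])  # [False, True, False, False, True]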

Notes:


In the ‘layer source controls’ set the movie loop mode to ‘cut to black’ to have the clip stop at the end.


Assign data-sources to trigger clips in the Media Bin inspector and use the UI Inspector to adjust the threshold level.


Previous, Next and Random clips can be triggered using buttons in the main Media Bin interface.


The Quantization option can be used to queue or postpone clip triggers until a data-source activates.

Bar graph visualizations with Quartz Composer and VDMX


Download the completed QC composition and VDMX project file for this tutorial.

Among the many uses of Quartz Composer, and perhaps one that is often overlooked for live performance visuals, is the ability to make visualizations of data and other information. Since QC compositions are rendered in realtime, when creating them for this purpose it can be helpful to make the patches reusable with published input values for use in other environments, or replicating within QC itself.

In this two part video tutorial we'll first create a composition in Quartz Composer that renders a simple bar graph data visualization with inputs for changing its labels and values that can be used in other applications. Once the patch is completed it can be loaded into VDMX and set up to respond to a variety of different inputs as a visualizer of tempo or audio levels as part of a VJ set, an on screen display of MIDI / DMX channels, or any other arbitrary data that you can get into your computer by OSC.


A Quartz Composer patch used to create two separate 4 column bar graphs in VDMX visualizing tempo and audio analysis levels respectively.


1. Creating the Bar Graph Composition in Quartz Composer

If you aren't interested in how the patch is made, download it and move on to the VDMX part below.

Note that this quick bar graph example is designed for beginners to the Quartz Composer language.

More experienced QC users following along may want to try the additional challenges of working with ‘Cube’ objects instead of a ‘Sprite’ to create a 3D chart, make use of an ‘Iterator’ to make the number of columns adjustable along with the other parameters, or include published input splitters for color values.


2. Using the Bar Graph Visualizer Composition in VDMX

The reusable Quartz Composer composition created in part 1 can now be used to visualize any data-source available to VDMX as a bar graph on a layer.

In this video we'll go through a few of the built-in options such as audio analysis and LFOs, as well as using the Hardware Learn option to quickly assign some MIDI sliders to the patch inputs.

Finally, we'll end by loading the patch multiple times to use it on more than one layer, displaying different sets of real-time data side by side.

3. Next Steps & Pro Tips...


Enable ‘Quad’ mode in the Layer Composition to apply perspective correction for basic projection mapping of the graphs onto surfaces.


Compositions can also have published outputs for returning numbers, colors, images and other control data back to VDMX for visualization.


Advanced: The text fields for QC inputs can receive their string values over OSC - use the UI Inspector for setup (see the sketch below).
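As a hedged illustration of that last note, sending a label string over OSC might look like the following with the third-party python-osc package; the address "/graph/label1" and port 1234 are placeholders for whatever OSC address and input port you assign to the text field in the UI Inspector:

    # Send a string value to one of the bar graph's published text inputs.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 1234)   # VDMX machine and OSC input port
    client.send_message("/graph/label1", "Bass")  # placeholder address and label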

Using Max/MSP/Jitter as an external FX send and data-source provider for VDMX


Download the completed Max/MSP/Jitter patch and VDMX project file for this tutorial.

Eventually when creating live visuals, particularly for a high profile event or tour, you may find you need to add some kind of specialized custom image processing, source generator, or information feed to your setup that really sets the show apart with its own unique style or effect.

Since this may be only one part of a show, it's often necessary, or at least desirable, for a VJ to be able to run everything from movie playback to output within VDMX without having to switch between applications while performing. This is made possible by loading the additional custom code either directly as an extension to VDMX, for example as a Quartz Composer composition, or by running it as a patch in a separate application that passes video and data feeds between programs using open protocols such as Syphon and OSC.

For this set of video tutorials we'll be taking a look at how to use one of our favorite visual programming languages, Cycling '74's Max/MSP/Jitter, which has been around for over 20 years as the tool of choice for creative coders experimenting with sound and video.

If you'd like to follow along with these lessons, begin by installing Max and the Syphon externals for Jitter on your Mac.


Max/MSP/Jitter patch running as an external FX send and data-source provider for VDMX layers.


Part 1 - Setting up an FX Send to Jitter that returns to VDMX over Syphon

To begin we'll first set up a very basic VDMX project that includes a ‘Syphon Output’ plugin for publishing a source layer to send a test video stream to Jitter.

Next, in Max, the jit.gl.syphonclient and jit.gl.syphonserver objects are used to create a patch that simply sends the incoming video frames back out to return to VDMX as a pass-through. This patch can be used as the starting point for creating an FX as demonstrated in the next step, and is also included in the download as a template.


Part 2 - Building the ‘Auto-Levels’ FX in Jitter

If you aren't interested in a lesson in Max/MSP/Jitter, download the completed patch and move on to part 3.

Once the initial setup for Syphon is complete the Jitter patch is ready to be set up as an FX to send modified images back to VDMX.

For this example we'll make an 'Auto-Levels' FX that uses the jit.3m object to compute the minimum, mean, and maximum brightness values of each incoming video frame. This data is then used by a jit.gl.pix to adjust the luminance curve in real-time, normalizing the input by making each pixel in the image brighter or darker.
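Outside of Jitter the same idea can be sketched in a few lines of Python with NumPy. This is only a conceptual illustration of what the 'Auto-Levels' FX does per frame, not the actual patch:

    import numpy as np

    def auto_levels(frame):
        """frame: float array in 0..1. Stretch so the darkest pixel maps to 0
        and the brightest to 1, normalizing the brightness range."""
        lo, hi = frame.min(), frame.max()
        if hi - lo < 1e-6:              # flat frame: nothing to stretch
            return frame
        return (frame - lo) / (hi - lo)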


Part 3 - Using Max as a data-source generator for VDMX over OSC

Along with video frames, it can be useful to send back some of the information being gathered by our Max patch for the purposes of driving FX parameters or source controls within VDMX. Likewise, a Jitter patch may have settings that you want to access without having to switch programs.

Since both applications support the MIDI and OSC protocols in both directions, it only takes a minute to set up this two-way talkback.


Additional Notes:


Jitter users can take advantage of the GPU accelerated Hap video codecs using the jit.gl.hap external.


When working with other applications that don't support Syphon, VDMX can still capture output with the ‘Window Inputs’ source.


See how to create the ‘Bar Graph’ data-visualizer Quartz Composer patch.

Guest Tutorial with Roger Sodré: Connecting VDMX and Blendy VJ by Syphon


The rapid adoption of Syphon on the Mac for passing video streams between applications has made it possible for many developers to focus on creating new, more specialized tools that aren't needed by the everyday VJ, but prove to be an invaluable solution in certain common situations or high-end productions.

Today we are joined by Roger Sodré, creator of Blendy VJ, an advanced tool for working with multiple projectors in situations where an overlap needs to be smoothly blended to appear as a single continuous output:

“Blendy VJ solves the problem many VJs, artists and projectionists face when more than one projector is needed to cover an area with light. It is usually impossible to make a perfect seamless alignment of multiple projectors side by side. Either you’ll have a gap between them, doubled light, or worse.


Overlapping dual projections.


Dual projections corrected with soft edge blending by Blendy VJ.

The solution is to overlap the projectors and fade out the edges of each one to make a seamless blend. Due to the nature of light, this fade is not a simple gradient, but a gamma curve. Blendy does all the math to get the perfect soft edge, for as many projectors as you need.”
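The math Roger describes can be sketched roughly as follows. This is only an illustration of the gamma-corrected crossfade idea, with 2.2 used as a typical display gamma; Blendy's actual calibration is more involved:

    def blend_ramp(t, gamma=2.2):
        """t: 0..1 position across the overlap region.
        Returns (left, right) signal values; after the projector applies its
        gamma, the combined light works out to (1 - t) + t = 1, so the overlap
        stays visually even instead of doubling up."""
        left = (1.0 - t) ** (1.0 / gamma)
        right = t ** (1.0 / gamma)
        return left, right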

In this guest tutorial Roger demonstrates how to set up VDMX to send video over Syphon to Blendy VJ where it is split into two separate signals that then have a soft edge blend applied over the section that will overlap when projected.

More information and tutorials can be found on the Blendy website: http://www.blendy.in/vj/tutorials/

Notes:

In addition to receiving video feeds from VDMX over Syphon, Blendy VJ can natively work with movies encoded into the GPU accelerated Hap video codecs for high efficiency playback of video files.

For those who are interested, Roger has shared the source code for the Hap extension to Cinder for other developers to use on his GitHub page.

Soundtrack for this tutorial is Kill All DJs, by Beatification, from Bolivia. http://www.mozcu.com/909-kill-all-djs.html


Blendy VJ performing advanced edge blending for three separate projections.

Introductory Quartz Composer Tutorials Translated into Portuguese


These introductory Quartz Composer and FX tutorials have been translated from English by Daniel Neves.

Title: Visual FX - Basics

The project file created for this tutorial can be downloaded HERE, or read this tutorial in English.

Step 1: Inspect the layer in the "Workspace Inspector".

Tip: To open the FX chain in its own window, click the expand button.

Step 2: Choose an FX from the pop-up menu to replace / add to the current FX stack.

Tip: FX are listed by category. Use the Assets Manager to create your own custom groups with the FX you use most often.

Supported formats: CoreImage, Quartz Composer, and FreeFrame / FFGL.

Step 3: Use the "Local Preset UI" to quickly save and reload FX settings on layers.

- Click the + button to save the current FX stack.

- Click the name of a saved preset to reload it.

Next in this series, see "using a layer mask FX" or "creating a custom FX with Quartz Composer".


Changing the FX on a layer.


Example setup with before and after previews.


Title: Quartz Composer Sources with Published Inputs

The completed example, VDMX project, and Quartz Composer patch for this tutorial can be downloaded HERE, or read in English.

Materials: What you need for this tutorial:

  1. Quartz Composer

Step 1: Add an Input Splitter object from the Library and set its data type in the Inspector window.


Some input types have additional parameters for setting the range and default value. These parameters are accessed from the Inspector.

Step 2: Use the contextual menu (right-click) on the Input Splitter and select the option under Publish Inputs.

Step 3: Drag the patch into the Media Bin in VDMX and load it. The patch's published inputs will appear as standard VDMX UI objects.


Repeat this process for each parameter you want to control from within VDMX.



Title: Creating FX with Quartz Composer

The example project and the qcFX created for this tutorial can be downloaded HERE, or read this tutorial in English.

Materials and prerequisites:

  1. Quartz Composer
  2. Read the "using the Input Splitter object" tutorial for Quartz Composer.
  3. Read the "adding FX to a layer" tutorial for VDMX.

Step 1: Add an "Input Splitter" to your Quartz Composer patch and set its type to Image.

The video stream from VDMX will be passed in here. Connect it to the parts of the patch that process the image as needed.

Step 2: Publish the image input using the key "inputImage".

Add and publish Input Splitters for any variable you want to control from VDMX.

 

Setting the type of an Input Splitter object to Image.


Publishing the input using the contextual menu.


Setting the published key to "inputImage".

 

Step 3: Choose "Open Assets Folder in Finder" from the Help menu and move the composition into the qcFX folder. The FX now appears in the list of available effects.

 

Adding an FX to a layer.

 

Title: Creating Custom Text Players with Quartz Composer for VDMX

Download the completed project and the example Quartz Composer file for this tutorial, or read in English.

Text layers are similar to other video generators in VDMX, but with a few extra user interface (UI) features for working with words and letters. Below these special controls are all of the additional parameters specific to the Quartz Composer composition that will be used to render the text as video.

 

Text files can be loaded from disk or created from one of the built-in sources in the contextual menu.


The player controls for the text input let you step through the file letter by letter, by word, by sentence, or by line break.

 

Before creating your own text styles in Quartz Composer, it is recommended that you read the Creating Quartz Composer based FX for VDMX tutorial.

"Text Source Composition" is the pop-up menu that selects which patch is used to render the text to the output. These Quartz Composer patches can be found in the qcTextSources folder in the Assets directory.

The special text generator protocol for VDMX compositions is designed to take text as input and display it on screen. When a text file is read in VDMX5, the individual characters, words, and sentences are passed to the Quartz Composer patches to create a visual interpretation of the text. These special text patches need to have an Input Splitter of type "String" with the published key "FileInput".

Once added to the VDMX assets, this text rendering composition will be accessible from the "Text Source Composition" menu in the layer controls when text files are being played.


Adding an Input Splitter with the published key "FileInput" to create a qcTextSource patch.


Text inputs ending with the suffix "_FontMenu" will appear as a pop-up menu for font selection when loaded into VDMX.


The "Example Text" composition used to render a text file in VDMX with font selection and text player controls.


Remixing Seamless 360 Degree Panoramic Movies Shot with the Kōgeto Dot in VDMX


Download the completed project file and example panoramic movies for this tutorial.

As part of our series on different techniques for VJs and visual artists to create their own content for performances and installations, today we are featuring the “Dot” camera attachment from Kogeto, which lets you easily shoot panoramic video from an iPhone. The footage can be loaded into VDMX for real-time cropping and panning, adjusting the point of view as the movie plays back or is remixed during a live performance.

While the idea of the Dot attachment is fairly simple, for a VJ the possibilities for creating interesting footage that takes advantage of an extra degree of freedom are pretty endless, particularly when combined with MIDI controllers and audio analysis for driving playback parameters.

In this video tutorial we'll look at some tips for working with movie clips shot using the Dot in VDMX, including setting up seamless 360 rotational loops, syncing the POV angle to the movie time, and how to fade between two different panorama clips.
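The seamless rotation works because a 360 degree panorama wraps around on itself, so a horizontal shift with wraparound never exposes an edge. A rough NumPy sketch of that idea (an illustration only, not the FrameHold-Manual FX itself):

    import numpy as np

    def pan_panorama(frame, angle_degrees):
        """frame: (height, width, 3) array covering 360 degrees horizontally.
        Shift the view by an angle; pixels pushed off one edge reappear on the other."""
        shift = int(frame.shape[1] * (angle_degrees % 360) / 360.0)
        return np.roll(frame, -shift, axis=1)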

From here, try adding a live camera feed to the project, or see how to auto-import your panoramas and photos shared from your iPhone via Dropbox.

Notes:


Use the “Fill” sizing mode in the Layer Composition settings to get a “pan and scan” style output.


Add the “FrameHold-Manual” geometry adjustment FX to create a seamless wrap around control of the camera point of view.


The thumbnail aspect ratio of each media bin plugin can be adjusted in its sub-inspector under 'Options'.


Tap the “Save” option to export the full panorama to your camera roll.


Use the free Image Capture utility or iPhoto to retrieve movies from your phone.


For best playback performance in VDMX make sure to convert your movies to PhotoJPEG, Apple Intermediate or the Hap video codec.

The Dual Mix 4 Layer Livid OhmRGB Slim Template


Download the completed VDMX project file and sample movies for this tutorial.

If you don't have a Livid OhmRGB Slim, this is still a great tutorial / template to review for the general technique.

This template is a variation on the four channel video mixer template in which two groups of layers (a left and a right bus) are set up, making it possible for a VJ to prepare, preview, and transition between mixes on the fly, similar to how a multi-deck DJ works live. The idea here is similar, but instead of two mixes that use separate sets of clips, in this template each bus shares the same set of four sources, each with its own controls for layer opacity, FX, and blend modes. For an added twist, the fourth source in this template is used as a mask overlay for each bus that can be used in a variety of ways to combine the two separate mixes together in the main output.

In this video tutorial we quickly go through the design of this template and how it can be used to mix two movie clips along with an audio reactive or LFO driven Quartz Composer composition as an overlay or mask.

Also in this series of templates check out the basic two channel video mixer Livid OhmRGB template, or try using the Movie Recorder plugin to capture the video output from this one.

Notes:


Left and right side faders set opacities for sources 1, 2, the overlay, and the masking for each bus separately.


Use the Control Surface plugins to change layer settings by mouse, or remap to other MIDI instruments.


Create different variations using the same set of sources and apply layer masks to black out areas before mixing.

DJ Mixer EQ Style Masking FX for VDMX (with bonus X-Session Pro template)


Download the completed X-Session Pro template file and Quartz Composer FX for this tutorial.

For a musician looking to add visuals to their existing live performance, or a VJ aiming for tighter sync in their A/V collaborations, it's often useful to come up with meaningful translations or parallels between some of the basic ideas behind the generating, mixing and layering of sounds and images–

In this technique tutorial we'll focus on two different ways the idea of a DJ-style low / mid / high EQ control can be interpreted in the world of video as FX in VDMX, as a means to mask out or adjust the gain on separate, discrete parts of a video stream for the purpose of blending video layers together. The first example swaps the low, mid, and high bands for the individual RGB channels of the image, raising or lowering the intensity of each independently. The second qcFX applies a similar concept to a 3-band equalizer, breaking the image into three sections based on the luma (brightness) level of each pixel instead of its frequency range.
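To make the second idea concrete, here is a conceptual NumPy sketch of a luma-based 3-band split (an illustration of the approach, not the shipped qcFX): each pixel is assigned to the shadows, midtones, or highlights band by its luma, and each band gets its own gain, just as a DJ EQ scales low, mid, and high independently. The 0.33 / 0.66 cut points are arbitrary example values:

    import numpy as np

    def luma_eq_mask(rgb, low_gain, mid_gain, high_gain, low_cut=0.33, high_cut=0.66):
        """rgb: float array (h, w, 3) in 0..1. Returns a per-pixel gain/alpha mask."""
        luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
        return np.where(luma < low_cut, low_gain,
               np.where(luma < high_cut, mid_gain, high_gain))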

Since many of the DJ-style controllers for computers are MIDI based, once these conceptual connections are drawn it is easy to set up VDMX and music making software to run perfectly alongside each other. Here we are using the X-Session Pro from M-Audio, but the underlying technique can be used with any similar instrument.

Notes and Next Steps:

The custom FX for this tutorial were made in Quartz Composer and can be further customized with additional input parameters, or see how to sync the clock plugin to the BPM from Ableton Live.


Removing (masking out) the brightest part of an image with the “3 Band Luma EQ” FX to show through to the background layer.


Using the included “RGB-EQ” FX to fade movie layers in and out, one color channel at a time.


Our completed VDMX template, a two channel DJ style video mixer for the M-Audio X-Session Pro with EQ and movie controls.

VDMX Question from the forums: How do you set an effect MIDI knob to be off when in the center position?


Download the completed project file for this tutorial.

This question comes to us from the VIDVOX forums, and is most easily explained with a quick demonstration– the goal is to have a MIDI knob that is used to make the video become more pixellated as it is turned left or right, but is a regular pass-through when set to its center point.

In this video we'll show off how to use an LFO plugin in VDMX to create a lookup curve for mapping a MIDI knob to a different range of values to drive our pixellate FX being applied to a layer.
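Numerically, the lookup curve amounts to measuring how far the knob sits from its center. A minimal sketch of that mapping (the LFO lookup curve in the project does the same job graphically):

    def center_off(knob):
        """knob: 0..1 MIDI value. Returns a 0..1 FX level that is zero at center."""
        return abs(knob - 0.5) * 2.0

    print(center_off(0.5), center_off(0.0), center_off(1.0))  # 0.0 1.0 1.0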

Once you've got the hang of this, move on to the next tutorial lesson on using DJ style EQ knobs with masking FX which includes two custom Quartz Composer compositions that can be tweaked with your own visual style.

Notes:


Use the data-sources from the LFOs to adjust the levels of one or more FX at a time with a single MIDI knob.


Right-click to add waveforms and lookup curves to the LFO plugin.


From the UI Inspector add a "Mark" at the 0.5 position on the time slider to easily jump to the exact center value by mouse click.

Creating Gesture Based Controls for VDMX using the Gestrument Kinect MIDI controller app


Download the completed VDMX project file for this tutorial. 

Last week on CreateDigitalMusic we caught wind of Gestrument Kinect (currently in beta), a simple Mac app that can be used to convert the camera depth data from a Kinect to MIDI for controlling music or live VJ visuals. Since it sends standard MIDI, it only took a few seconds to connect it to VDMX for a quick demonstration of using its gestures to trigger events and adjust video FX parameters.

The four pieces of control data provided by Gestrument Kinect are the active signal, the x and y blob position, and dynamics (weight). These values can be used together in a variety of different ways within VDMX to set up controls that respond to actions such as waving a single hand from right to left.

First, to get the video signal we can use the Window Inputs feature to capture and crop the on screen display from the Gestrument Kinect application.

Next we'll use a Control Surface plugin to create custom data-sources that represent different actions based on the incoming MIDI data and use the results to advance to the next movie in a media bin.
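For anyone curious what such an action looks like as logic, here is a rough Python sketch of a 'right to left' wave detector working on a short history of normalized x positions. The 0.8 / 0.2 edge thresholds are arbitrary example values; in the project the Control Surface plugin builds the equivalent behavior out of data-sources rather than code:

    def detect_right_to_left(x_values, high=0.8, low=0.2):
        """x_values: recent x positions (0..1), newest last.
        True if the hand traveled from the right edge to the left edge."""
        seen_right = False
        for x in x_values:
            if x >= high:
                seen_right = True
            elif seen_right and x <= low:
                return True
        return False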

Notes and next steps:

Once you've got this mastered, try controlling a video feedback loop or crossfading between four different layers.


Adjust the Depth range slider in Gestrument Kinect until only your hands are visible.


In the ‘Workspace Inspector’ under Vid In, enable ‘Window Video Inputs’.


Completed VDMX project with ‘Right to Left’ and ‘Bottom to Top’ hand wave gestures.

Making custom face tracking video FX and data-sources for VDMX with Quartz Composer


Download the completed Quartz Composer FX and VDMX project file for this tutorial.

Among the video analysis tools available within Quartz Composer is the “Detector” object which examines an input video signal and outputs the size and coordinates of any located faces within the frame. This control information can then be used for automatic real-time image cropping from live camera feeds, adjusting the parameters of FX while VJing at a concert, or creating interactive art installations that react to the location of people within a space.

For this demonstration we've made two basic QC compositions using this simple detection object that can be loaded into VDMX to perform basic face capture and replacement FX that can be connected in a variety of ways. You can also use these example patches as starting points for your own patches that perform more complex behaviors like tracking multiple faces within a single frame or publishing additional control information.

The first composition is designed to be loaded as a QC based data-source plugin and provides both a cropped image output of a detected face and the raw values for modifying the levels of any controls within VDMX, such as the size and position of a layer.

Our second example patch is an FX that can be applied to any layer to automatically position an overlay video feed on top of the face located in the input. The overlay provided can be from the cropped output from our plugin, or any other available video source.
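For reference, a similar normalized face rectangle can be produced outside of Quartz Composer, for example with OpenCV's Haar cascade detector. This is a different technique than the QC 'Detector' object and is shown only as a rough sketch of the kind of values involved:

    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def face_rect_normalized(frame_bgr):
        """Return (center_x, center_y, width, height) of the first detected face,
        normalized 0..1, or None if no face is found."""
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) == 0:
            return None
        x, y, w, h = faces[0]
        fh, fw = frame_bgr.shape[:2]
        return ((x + w / 2) / fw, (y + h / 2) / fh, w / fw, h / fh)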

Notes:

After watching this, try adding a sound reactive element to this setup or read more about how to extend VDMX with your own Quartz Composer based FX and data-source generator plugins


The example “Face Capture” plugin finds a face and crops it out from its original frame for use in VDMX.


Use the “Face Replacer” FX to replace a detected face with an alternate video stream.


The raw center, width, and height values are also published as data-sources to control other QC patches.


Use either live camera feeds or prerecorded footage for the capture or replacement video layers.


The “Detector” object can also find the positions of the eyes and mouth.


Quickly open the VDMX Assets folder from the Help menu to reveal the plugins and qcFX folders.

Recording a demo reel from a Quartz Composer composition or other video generator to share online


Download the completed project file for this tutorial or view the example shared on videopong.net 

Read how to enter the Videopong “Share Everything Contest” to win a license of VDMX!  (ends July 17th, 2013)

It kind of goes without saying that these days posting your work online is a great way to promote yourself as a VJ or creative coder, or just to make new contacts for future collaborations. Along with a studio or live mix of your visual work, including some of the original resources that were used along the way for other people to learn from is another way to make your mark in an online art community.

For this technique tutorial we'll be looking at recording a demo reel that shows off the different ways that your generative compositions can be used in a live setting by using different sets of control data to drive its parameters, such as time based LFOs, MIDI / OSC control, and audio analysis data-source providers. Once we've finished creating the sample movie, we'll also walk through how to share the files using the videopong.net website where they can be hosted, downloaded and remixed by other video artists for free.

To begin, a Movie Recorder plugin is added to the project and set to receive the output from the layer that is playing our sample Quartz Composer composition. Next we'll add a few controller plugins to automate the inputs of the patch to show off different setups.

Once the demo reel video showing off the capabilities of the patch is recorded, it can be uploaded along with the original interactive generator file. 

Notes:

Section Presets in the layer source panel save and load the input controls for generators.

Use the Movie Recorder inspector to adjust the capture settings for video files.

Also control and capture the output from custom applications with OSC and Syphon.


VJing with a WiiMote game controller in VDMX


Download the completed template project for this tutorial. 

While MIDI devices and OSC enabled applications tend to be the most commonly used instruments for VJs getting away from the computer during a performance, another extremely powerful controller that can be paired with VDMX is the WiiMote game controller, which can connect to a Mac wirelessly over Bluetooth.

When using a WiiMote with VDMX, four different types of data-sources are available for triggering and adjusting playback, FX, and composition settings in your setup:

  1. Buttons – Use the detect option to assign these the same way as keyboard presses and MIDI notes.
  2. Orientation – The pitch, roll, and yaw of the controller, and of the optional Nunchuk attachment.
  3. IR Pointer Position – The x and y position of the remote as seen by the WiiMote sensor bar.
  4. Nunchuk Joystick – The x and y position of an optionally connected analog 2-D stick input.

 

In this video tutorial we'll start by enabling Bluetooth communication and detecting the controller with the WiiMote plugin in VDMX. Once the basic configuration is working we can begin to put together a sample project that is designed to get the most out of combining accelerometers with game-style buttons for performing live visuals.

The WiiMote can also be a great controller to extend existing standard templates such as the basic movie player or the more advanced four channel video mixer.

Notes:

Enable Bluetooth on your Mac and pair the WiiMote from the System Preferences.

An active WiiMote plugin and orientation offset settings in the Workspace Inspector.

Assign WiiMote buttons from the right-click detect, UI Inspector, or Hardware Learn.

Syncing the playback of multiple movies in VDMX over a network using OSC


Download the completed project file for this tutorial.

One of the data-sources available within VDMX for controlling playback, FX, and composition parameters, is the current playhead position of each movie playing on a layer. Like an LFO or audio analysis value, you can assign this to any slider, button, or other UI item by using the UI Inspector or from the right-click contextual menu.

In this tutorial the movie “normalized time” parameter (time as a percentage, ranged 0.0 to 1.0) will specifically be used to synchronize the playback of multiple movie files – this can be a useful technique for working with batches of clips that have the same duration, and high-end projects that require more than one computer to run all of the displays or projectors.

For this example, we'll begin by syncing two HD movies for output by using the timecode from one of the videos used to control the time position of the other. In the second half, the master time value and clip triggers from our media bin are sent as OSC messages to control movies playback on a second computer connected over Ethernet for low latency communication.
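Under the hood this is just a float OSC message sent each frame. A hedged illustration using the third-party python-osc package; the address "/master/time" and port 1234 are placeholders for whatever address and OSC input port you assign in VDMX on the receiving computer:

    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("192.168.1.20", 1234)   # example IP of the second computer
    client.send_message("/master/time", 0.25)        # normalized time: a quarter of the way through the clip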

After completing this tutorial, see how to slave a movie in VDMX to MIDI Timecode from another application.

Notes:

The “Normalized Time” data-source value is ranged 0.0 to 1.0 for most situations.

Send any slider value, including movie time, out over OSC to control another computer.

Receive OSC values to control the movie time on remote computers.

Guest Tutorial: Connecting Unity 3D and VDMX by Syphon with Alejandro Crawford


For this guest tutorial we are joined by Alejandro Crawford, the visualist for MGMT (among others), in which he'll show us one part of the setup he uses for creating his live visuals by connecting a scene rendered in the powerful 3D gaming engine Unity to VDMX using the Syphon protocol to pass video back and forth between the two different programs.

To begin, download the Syphon for Unity add-ons by Brian Chasalow which include a template Unity project with the resources needed for publishing and receiving video feeds from other applications to use as a starting point. With the provided scripts the output of the Main Camera in Unity can be made available as a live feed in VDMX where it can be processed and mixed with other sources.

Once you've got the hang of using Syphon, check out some of the other ways to connect Unity to Mac VJ software, such as add-ons for receiving OSC messages from VDMX data-source providers.

 


Notes and Tips:

Download the Syphon for Unity add-ons and templates by Brian Chasalow.

The “SyphonServerTexture” script makes the Main Camera in Unity available as a VDMX source.

The Syphon Output plugin in VDMX publishes video layers to use in other applications.

Add planes and other shapes to display video feeds from VDMX in your Unity scene.

Apply the “SyphonClientTexture” script to planes and configure the input settings.

A Quartz Composer composition in VDMX displayed on a Unity video plane.


During a live show, Unity is just one of the many sources that Alejandro feeds into VDMX over the course of a performance – custom Jitter patches, AR Drone cameras, and multiple Kinects are all brought together, along with techniques like video feedback loops, which were used for the Alien Days visuals.

Alejandro's VJ rig, MIDI, cameras, and software all routed through VDMX.

3D Dragon Fly scene in Unity with video plane from VDMX in the distance.

Installing and Using the v002 FX plugins with VDMX


One of the most useful sets of open source FX plugins for Quartz Composer is the v002 collection maintained by Vade and Bangnoise, which is now included as an optional separate package along with VDMX. Included are the v002 optimized fast blurs, “film” image filters, analog / digital glitch, and the Rutt-Etra analog video synthesizer emulator, along with QC based FX ready to use in VDMX or your own QC compositions.

To use the add-on plugins with VDMX or Quartz Composer, first run the installer included on the VDMX disk image (dmg). Once installed, the add-on QC based FX will appear in the v002 category in the filter picker for Layer FX Chains where they can be applied to any source or layer group within VDMX. Likewise, the plugins the FX are based on are available in the Quartz Composer Editor for use in your own compositions.

Notes:

The v002 installer can be found on the VDMX disk image.

Adding a v002 FX to a VDMX layer.

Use the v002 Quartz Composer plugins in your own compositions.

Enabling hardware “Echo” mode for quick MIDI and OSC talkback with VDMX


 Along with being able to receive real-time control values from MIDI and OSC based instruments, VDMX provides the ability to send the local state of interface items such as sliders and buttons back out to hardware controllers whose interfaces can update dynamically.

To make setting up two-way talkback with devices that support this kind of workflow faster, each UI item in VDMX that is receiving from a MIDI or OSC source can be set to “echo” its state back to the connected hardware controller. In most cases echo can simply be enabled for a slider, button, or pop-up menu, and from the UI Inspector you can further customize how values are sent back to deal with special cases on your particular device, such as telling pads on an OhmRGB or APC40 to light up a specific color when toggled on or off by sending specific velocity values.
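The messages involved are ordinary MIDI. As a rough illustration of the kind of message echo sends back, here is a note-on with a chosen velocity sent from Python with the mido library; the port name, note number, and velocity are examples only, and the velocity-to-color mapping depends on your controller:

    import mido

    out = mido.open_output("OhmRGB")                        # substitute your controller's port name
    out.send(mido.Message("note_on", note=36, velocity=5))  # the velocity selects the pad's LED color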

Echo mode syncs UI items such as sliders, knobs, buttons, pop-up menus, and media bin plugins with hardware MIDI and OSC controllers.


Introduction to MIDI / OSC Echo with the Livid OhmRGB and TouchOSC

In this set of video tutorials we'll first demonstrate the basics of setting up “echo” using the Livid OhmRGB MIDI controller and TouchOSC on the iPad as examples.

Also check out the related introduction video tutorials on setting up Media Bin sync with TouchOSC and setting up Media Bin UI sync with MIDI controllers.

Notes:

The “Enable Echo” toggle in the UI Inspector turns on two-way talkback. Click on a MIDI or OSC receiver to customize its settings.

Set custom note on / off velocity values in the “Options” tab of the inspector to match the available LED colors on your MIDI controller.

The “Control Surface” plugin can import TouchOSC layout files with echo automatically configured.
