dance-or-dont(1)

glfmn.io dance-or-dont(1)
Name

dance-or-dont

Postmortem of my first hydra performance.

Description

I performed with hydra for the first time at Dance or Don't; it was a blast!

hydra prep

hydra is a JavaScript live-coding framework that builds GLSL shaders from intuitive expressions like this:

osc().diff(o0).rotate().out()

I have been practicing making stuff with hydra for a few months now. I've learned a bunch about how it works, and I've still only scratched the surface!

hydra provides an extension point for adding custom GLSL via setFunction, which I used both to add new functions and to alter how some of the built-ins work. In particular, I changed a few functions to better support alpha blending, which I used to build layers in my patches.
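As a rough sketch of what that looks like (a made-up example, not actual code from my extension), setFunction takes the new function's name, its type, its inputs, and a GLSL body; a color function receives the current pixel as _c0 and returns a vec4:

setFunction({
  name: 'invert',
  type: 'color',
  inputs: [{ type: 'float', name: 'amount', default: 1 }],
  // blend between the original color and its inverse, keeping the alpha
  glsl: `return vec4(mix(_c0.rgb, 1.0 - _c0.rgb, amount), _c0.a);`,
})

// once registered, it chains like any built-in:
// osc().invert(0.8).out()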

I have uploaded my little hydra extension to this site; you can include it in hydra and play around like so:

await loadScript('https://glfmn.io/code/hydra-ludens.js');

I changed shape to have alpha of 0 outside the shape for easier use with layer. I also wanted some more interesting sources to work with, so I made:

dist
Simply outputs the distance from [0.5,0.5] in the UV space; provides a nice and simple way to create interesting shapes and modulation.

I used it in basically every patch I made during my performance to create interesting elements, especially by combining multiple scrolled and rotated copies with add or mult.

fract
A wrapper around GLSL's fract function, useful for creating crunchy glitchiness. Essentially it just multiplies the input and then passes it to fract to get the fractional component.
manhattan
A modified version of hydra's voronoi which instead uses the Manhattan distance function: dist.x + dist.y. All the lines between cells align to 45-degree angles. I didn't use this one much, but it does look cool!

I also made another altered voronoi function I called bugeye:

bugeye
Rather than assigning a color to each voronoi cell, this function modulates the UV coordinates to be relative to the center of each voronoi cell. As the name suggests, it makes the patch look like little reflections of itself in the eyes of a bug.

I had a couple of functions which I couldn't make work in time:

smoothCell
Yet another voronoi-derived function which uses a smooth minimum to color each pixel, resulting in metaball-esque visuals. I couldn't get the smoothness, the curvature of the distance function, and the size of the individual cells to work nicely together.
withHue
Uses the average rgb pixel value of one image to shift the hue of another image. I had it working as a color function with a vec4 input, but current hydra doesn't properly propagate UV changes to vec4 inputs, even if they are generators.

I was working on changing this to be a combine function right before showtime, but I had a GLSL compile error so I opted to cut it from the performance.

I also added a little lerp function to help with audio-reactivity.
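It's just a linear interpolation, something along these lines (the exact definition is in hydra-ludens.js, so treat this as an approximation):

// map t (roughly 0..1, e.g. an fft bin) onto the range [a, b]
const lerp = (a, b, t) => a + (b - a) * t;

That makes it easy to scale an fft bin into whatever range a parameter expects, e.g. () => lerp(0.7, 2.1, a.fft[2]).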

Here is a patch I came up with the night before in my practice session:

await loadScript("https://glfmn.io/code/hydra-ludens.js");

a.setCutoff(3)
a.setBins(8)
a.setSmooth(0.1)

shape(3, 0, () => lerp(0.7, 2.1, a.fft[2]))
  .fract(5)
  .modulateRotate(dist(0.4), () => lerp(0.5, 2, a.fft[3]*a.fft[3] + a.fft[0]))
  .rotate(0.0, 0.1)
  .layer(dist().color(0.3,0.5))
  .bugeye(3,0.1)
  .scrollY(() => a.fft[1]*a.fft[1]*0.05)
  .layer(dist().color(0.02,0.1,0.3).scrollX(-0.3).repeat().fract(() => lerp(1.1, 2.7, a.fft[5])))
  .diff(src(o0).fract([0.4,1.1].smooth()).thresh(0.5,0.1).scale(0.99).scale(1.01))
  .diff(o0)
  .scrollX(0.01)
  .rotate(() => lerp(-0.01, 0.01, a.fft[0]))
  .hue(0.2)
  .saturate(() => lerp(0.8, 1.2, a.fft[6]))
  .out();

The Lineup

The event was called dance or don't, organized by so_so_gutter.

DJs/audio

  1. so_so_gutter
  2. ollyX3
  3. ToPher DJ
  4. yulia
  5. Dr. Cr0de

Visuals

  1. cursorhead
  2. elle
  3. me :)

Due to the odd ratio of visual performers to DJs, we decided to each take 15-minute blocks of the DJ sets. soso's set was 30 minutes and ended up being mostly warm-up and extra setup time.

Setup

soso provided a little 5-to-1 HDMI switcher, which proved instrumental. The venue's mixer also had an aux send which, using their adapter cables, I was able to plug into my audio interface for some reliable audio reactivity.

We taped down the audio interface and HDMI switcher. The projector was tilted and keystone-corrected to center the image, so I used inspect element to add 100px of padding to the element containing the code to keep it visible on the projector screen.
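In devtools terms the change amounts to something like this; the selector here is just a placeholder for whatever element actually wraps the editor:

// keep the code away from the edge of the keystoned projection
document.querySelector('#editor-container').style.padding = '100px';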

I used a Rust http-server library to serve a local copy of my hydra-ludens.js script so I wouldn't need an Internet connection.

I also wanted to run a local copy of hydra, but I ran into some hitches: I wanted to run it with some patches that aren't in the official version hosted on the net, and the hydra repo loads that hosted version instead of the one in its node_modules. I was able to tether internet, though, so in the end it wasn't necessary.

I tried out hydra-tap at the last minute to get tap-tempo control over hydra's [] syntax. However, I was experimenting with the dev branch, and the C-space key combination interacted badly with the updated CodeMirror editor. I ended up having to delay my first performance to switch to the main branch because hydra (more likely CodeMirror) broke completely. It felt like the internal time variable broke; I suspect it picked up a NaN on the JavaScript side or something of the sort.
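For context, the [] syntax steps through parameter values in time with hydra's global bpm, which is presumably what a tap-tempo control like hydra-tap adjusts under the hood; for example:

// arrays cycle through their values in time with the global bpm,
// so retiming everything is just a matter of changing bpm
bpm = 128
osc(30, 0.1, [0.2, 2].fast(1)).out()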

Set 1: ollyX3

Due to how we were laid out on the table, I was going to go second, but I ran into the aforementioned bug, so elle went before me instead. Since our individual slots were so short, I improvised each patch during the 15-20 minutes before my turn in each set.

I was pretty nervous during the first set, so I kept things simple. I don't remember having much of a concept in mind, except that I wanted to build visual interest with movement across a few different scales: color, distortion, and small shapes.

Here's a snippet of my patch:

await loadScript("https://glfmn.io/code/hydra-ludens.js");
await loadScript("https://hyper-hydra.glitch.me/hydra-tap.js");

a.setCutoff(3)
a.setBins(6)
a.setSmooth(0.1)

osc(30, 0.1, [0.2,2].fast(1))
  .blend(src(o1).rotate(), 0.2)
  .saturate(0.8)
  .luma()
  .layer(shape(4,0.1,0.2)
         .shift(0.3,0.3)
         .bugeye(() => lerp(3, 3.4, a.fft[0])), 0.5)
  .layer(dist(() => a.fft[3])
         .repeat()
         .color(0.1,0.2)
         .hue(0.4)
         .rotate(() => lerp(0.5, 0.9, a.fft[2])))
  .hue(0.4)
  .saturate(() => lerp(1, 1.2, a.fft[2]))
  .modulate(o1,[0.1, 0.3].fast(0.2))
  .out(o1)

render(o1)

This set was a blur for sure, and I don't remember making a lot of big changes to this patch as my 15 minutes came and went.

Special shout out to ollyX3 who was also performing for the first time, and who did a wonderful job!

Set 2: ToPher DJ

This time for ToPher DJ's set I wanted to play more with visual feedback. Since it was easier for me to work with alpha now, I used a bit of scrolling and rotation on the feedback src to leave cool trails behind.
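Stripped to the basics, the trails come from layering the new content over a slightly scrolled and rotated copy of the previous output. A minimal sketch of the idea (not the actual patch), relying on the modified shape being transparent outside the shape:

// draw a shape on top of a slightly moved copy of the last frame of o1
src(o1)
  .scrollY(-0.002)
  .rotate(0.01)
  .layer(shape(4, 0.2))
  .out(o1)

render(o1)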

I also tried out a trick elle has been using to create pseudo-3D in hydra, which gave the patch a looking-down perspective, like peering into a pit of sand.

Here's a rough snapshot of what it looked like at the end:

await loadScript("http://glfmn.io/code/hydra-ludens.js");
await loadScript("https://hyper-hydra.glitch.me/hydra-tap.js");

a.setCutoff(2)
a.setBins(6)
a.setSmooth(0.3)

osc(30,0.1,[0.9,1, 0.5, 2, 4].fast(8)).add(osc(0.4, 0.2, 0.1).rotate(), [0, 0.1])
  .kaleid([4,6].fast(2))
  .luma()
  .blend(src(o1).rotate().modulateRotate(shape(6,0,0.3).bugeye()),0.2)
  .saturate(0.8)
  .modulateScale(dist().bugeye(50,0.9),0.1)
  .saturate([0.2, 2].fast(2))
  .hue([0.2, 0.4, 0.5, 0.7].fast(2))
  .saturate(() => lerp(1, 1, a.fft[1])).rotate(0.4)
  .modulate(o1,[0.1, 0.3].fast(0.2))
  .mask(dist(() => a.fft[0])
        .modulate(dist(0.5).scrollY(0.1))
        .fract([2.1, 12].fast(0.125))
        .thresh(), [0, 1])
  .rotate(0,0.1)
  .diff(src(o1).scrollY(-0.001).rotate(0.01))
  .saturate([0.9, 1.2, 1.2].fast(4))
  .modulateScale(dist(() => a.fft[1]).bugeye(),[0.1, 0].fast(1/8))
  .hue(0.3)
  .saturate([0.4, 0.5, 0.9].fast(2))
  .scrollY(() => lerp(0.0, 0.01, a.fft[1]))
  .out(o1)

osc(4,0.25,2).out(o0)

render(o1)

Being so audio-reactive, this one felt like the music was pushing sand around. I had a lot of fun commenting bits in and out. A moment I was particularly happy with was realizing I could stack a bunch of modulate(dist()) calls to push the shape of the dist().fract() mask around, like this:

.mask(dist(() => a.fft[0])
      .modulate(dist(0.5).scrollY(0.1))
      .modulate(dist(() => lerp(0, 0.5, a.fft[2])).scroll(-0.1, 0.4))
      .modulate(dist(0.2).scrollX(0.1))
      .fract([2.1, 12].fast(0.125))
      .thresh(), [0, 1])
  .rotate(0,0.1)

Commenting out the entire mask chunk created a completely different vibe, which was fun to toggle when a drop would come.

Set 3: yulia

This time I wanted the focus to be on the center of the screen and maybe create a sense of being drawn in. I had watched more of cursorhead's and elle's performances during the breaks, and I had lots more ideas about how I could change things up in interesting ways.

Here's a snippet of my patch for yulia's set:

await loadScript("http://glfmn.io/code/hydra-ludens.js");
await loadScript("https://hyper-hydra.glitch.me/hydra-tap.js");

a.setCutoff(2)
a.setBins(6)
a.setSmooth(0.1)

voronoi(
  [10, 10, 10.2, 10.1].smooth(3),
  [0.4, 0.2].fast(1),
  [0.3, 1.0].fast(1/8)
)
  .scrollX(0.25)
  .mask(shape(4,0.4,[0.4, 0.2, 0.7].fast(4))
        .fract(1.5))
  .thresh(0.4)
  .scrollX(() => a.fft[3]*0.1 + 0.01)
  .layer(dist()
         .mult(dist()
               .scrollX(() => a.fft[0]*0.4 + 0.3)
               .fract())
        .shift(0.1,-0.2))
  .scale(1.1).rotate(0.1)
  .diff(o0)
  .scrollY(0.01)
  .rotate(0.01)
  .scale(0.99)
  .layer(dist()
         .fract(4).color(0.3,0.0)
         .mask(shape(4,0.4,[0.4, 0.2, 0.7].fast(4)).fract(1.5)))
  .diff(o0)
  .out()

I used masks to cut out bits of the various layers to create a defined center, and tried to get their colors to match up in interesting ways.

I had a bit more fun messing with tap tempo to perform the patch.

My favorite moment of yulia's set was a track with an insane arp that made my head feel like it was spinning. I was mesmerized!

Set 4: Dr. Cr0de

For Dr. Cr0de's set I was trying to get more of a sense of flashing between different scenes. This set was a bit more unpredictable for me than the other performances, so it was a challenge to find ways to keep up! I leaned into the audio reactivity more in some parts, but it just so happened that I could create a really different vibe by controlling the tap tempo.

Here's a snippet of my patch:

// My name is gwen but you can call me ludens
// This is hydra, a live coding javascript library for making GLSL shaders

await loadScript("http://glfmn.io/code/hydra-ludens.js");
await loadScript("https://hyper-hydra.glitch.me/hydra-tap.js");

a.setCutoff(1)
a.setBins(6)
a.setSmooth(0.1)

hush()

dist()
  .mult(osc(4.5, 0.1,0.8))
  .out(o1)

dist()
  .modulate(src(o1)
            .rotate(0,0.5),[2, 0.3].fast(1/4).smooth())
  .add(src(o1).color(0.3,0.4))
  .luma()
   .modulateScale(dist(0.5), 0.1)
  .scale([0.3,0.31].fast(1/2),0.5)
  .scale(() => a.fft[4]*0.4 + 1.6)
  .rotate().layer(dist().fract())
  .rotate().layer(dist().fract())
  .rotate().layer(dist().fract())
  .bugeye([2, 4, 3, 20].fast(4))
  .kaleid([0, 3].fast(1/2))
  .color(0.1,0.2,[0.3, 0.9].fast(1/4).smooth(3))
  .hue([0.0, 0.3, 0.5].fast(1/8))
  .layer(src(o0).hue(0.3).mask(shape(4,0.5,0.5).rotate(0,0.4)).mult(dist(0.3).thresh()))
  .modulate(o0, [0, 2, 4].fast(2))
  .modulate(dist(), () => a.fft[0]*0.1)
  .diff(src(o0).saturate(0.3)).modulate(o0)
  .rotate([-0.1, 0.2])
  .modulateScrollX(osc(), 0.01)
.modulateScrollY(dist(), [0.1, 0.01].smooth())
//   .colorama()
  .saturate(() => a.fft[0]*1.6 + 0.5)
  .out()

I had intended to try using two channels (o0 and o1), but it proved too complicated to manage.