Special Topics Module 2

Research & Activity Documentation

Wanqing Li


Project 2


Module 2

In Project 2, I explored the features and underlying logic of TouchDesigner, experimenting with audio-visual interaction and shaped light sculpting. I used audio to drive the visualization, used a physical model to test the light-sculpting effect, and combined both with a projector. Through this I gained a deeper understanding of TouchDesigner's application in real-time visualization and immersive experience design.


Workshop 1

In this project, I experimented with TouchDesigner's CHOP, SOP, and TOP components, exploring different visual effects and interactions:

1. Mouse following.
2. Further experiments with particles, dynamic halos, and sensor interactions, researching how to enhance the visual hierarchy through color overlays and parameter optimization.
3. In-depth experiments with halos, gradients, and blurring, and with the logic behind natural glow changes and other visual effects.

These explorations helped me understand TouchDesigner's real-time interaction, light and shadow control, and visual hierarchy construction, and laid the foundation for more complex visual projects.
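As a record of the mouse-following experiment, here is a minimal sketch of how the trailing behavior can be scripted in a CHOP Execute DAT. The operator names ('mousein1', 'geo1') and the smoothing factor are assumptions for illustration, not the exact network I built:

    # CHOP Execute DAT attached to a Mouse In CHOP named 'mousein1';
    # 'geo1' is a Geometry COMP holding the followed shape (names assumed).
    SMOOTH = 0.15  # lower values lag further behind the cursor (trailing feel)

    def onValueChange(channel, sampleIndex, val, prev):
        geo = op('geo1')
        mouse = op('mousein1')
        # Ease the geometry toward the cursor instead of snapping to it.
        for axis in ('tx', 'ty'):
            cur = getattr(geo.par, axis).eval()
            setattr(geo.par, axis, cur + SMOOTH * (mouse[axis].eval() - cur))
        return

The same easing idea generalizes to the trailing halo and particle effects: any value can follow its target with a lag instead of jumping to it.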

My goals were to explore and reproduce CHOP and SOP functionality, coordinate synchronization, and dynamic interactions with trailing effects, in preparation for implementing Leap Motion gesture data collection and object coordinate synchronization in Project 3. I also tried to reproduce dynamic halo effects, color overlays, color trails, and color gradients to give the visuals a sense of flow, and tested glow, gradient, blur, and color-mapping effects, since I need them for my next target (the cube).

Project Management

I would like to recreate my P5.js project, a music-driven 3D interactive visual, in TouchDesigner, and to explore the possible ways of implementing audio-visual interaction.

To achieve this, I expected to rely heavily on a combination of TouchDesigner's CHOP, SOP, and TOP components.

In terms of project management, I will use an iterative experiment-compare-optimize approach, testing many different scenarios. First, I set the core goal of having the cube change dynamically in response to the audio signal, and tried different ways of changing its size, color, and position to find the best path to that goal. Second, I continuously adjusted parameters, optimized the interaction logic, and recorded the effects of the different approaches, so that I could finally choose the best solution and integrate it into a complete audio-visual interactive system.

Workshop 2

Analysis: what components do I need to learn that will help me achieve my goals?

1. Make sure audio data can drive visual changes by learning the audio CHOPs (audio signal analysis); see the sketch after this list.

2. Study 3D cube creation and control, and use the Transform SOP and instancing to make multiple cubes respond to audio.

3. Dive into color and lighting to make visuals more layered.

4. Research multi-object interaction to ensure that I can synchronize the movement, color changes, and other behaviors of multiple cubes.
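As a sketch of item 1, the following CHOP Execute DAT callback maps a loudness value onto the cube's scale. It assumes an Audio Device In CHOP feeding an Analyze CHOP (set to RMS power) named 'analyze1', and a Geometry COMP 'geo1'; all names and ranges are illustrative:

    # CHOP Execute DAT on 'analyze1' (hypothetical); fires when the level changes.
    def onValueChange(channel, sampleIndex, val, prev):
        # 'val' is the current RMS level; remap an assumed 0-0.5 input
        # range onto a 1x-2x uniform scale.
        op('geo1').par.scale = tdu.remap(val, 0.0, 0.5, 1.0, 2.0)
        return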

Learning objectives:

1. Create 3D cubes with SOP components (Sphere SOP, Box SOP), use the Transform SOP and instancing to make multiple cubes change at the same time, and link the position, size, and rotation of the cubes to the audio signal.
2. Create dynamic color changes with the Ramp TOP and Lookup TOP, generate glow and blur effects with the Glow TOP and Blur TOP, and add lighting with the Phong MAT and PBR MAT.
3. Make multiple cubes move with different frequency bands of the audio, managing the multiple data inputs with the Merge CHOP and Select CHOP (a sketch of this setup follows).
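One possible way to approach objectives 1 and 3 together is a Script CHOP that emits one sample per cube, which a Geometry COMP can then read as instancing channels (translate and scale per instance). The sketch below assumes an upstream CHOP named 'bands' with one channel per frequency band; the layout and names are illustrative:

    # Script CHOP callback: one output sample per cube/instance.
    def onCook(scriptOp):
        scriptOp.clear()
        bands = op('bands')                   # assumed: one channel per band
        n = bands.numChans
        scriptOp.numSamples = n
        txVals, scaleVals = [], []
        for i in range(n):
            level = bands[i].eval()           # current level of band i
            txVals.append(i - (n - 1) / 2.0)  # spread the cubes along X
            scaleVals.append(1.0 + level)     # each cube scales with its band
        scriptOp.appendChan('tx').vals = txVals
        scriptOp.appendChan('scale').vals = scaleVals
        return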

Action Research Testing

I tested different approaches, e.g., making the color, size, and motion of the cube respond dynamically to the audio signal, and iteratively adjusted the audio resolution, geometric transformations, and color mapping.


Action Research Cycle 1

I tested multi-level nesting of 3D cubes in TouchDesigner, designed to reproduce the music-driven, uniformly transforming cubes of different sizes from P5.js. I tried a 2D nested structure, but multiple levels of control are harder to implement in TD alone, especially keeping the smaller cubes affected by the music while the larger ones stay stable, which complicates the data mapping and interaction logic. I then optimized the audio response of a single cube: I tested dynamic changes in size, color, and glow so that the cube expands at low frequencies, and adjusted Glow and Blur to enhance the visual effect. This step was to ensure the underlying audio-visual interaction logic was correct before attempting to synchronize multiple cubes. To bring the visuals closer to my P5.js version, I added rounded corners and outer-frame tracer lines, as well as inner glow, scrolling raster, and outer glow-blur effects, and tied the cube's size change to the audio.
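The low-frequency expansion described above can be recorded as a small sketch: an Execute DAT eases the cube's scale toward a bass-driven target each frame, approximating what a Lag CHOP does in the network. The channel name 'low', the operator names, and the constants are assumptions:

    # Execute DAT: smooth bass-driven expansion of the cube.
    state = {'scale': 1.0}   # persists between frames inside the DAT module

    def onFrameStart(frame):
        low = op('bands')['low'].eval()   # assumed low-band level, 0..1
        target = 1.0 + 2.0 * low          # bass pushes the cube outward
        s = state['scale']
        s += 0.2 * (target - s)           # exponential ease toward the target
        state['scale'] = s
        op('geo1').par.scale = s
        return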

Action Research Cycle 2

For audio data parsing and mapping, I split the music into frequency bands and mapped each band to the cube's attributes individually, so that size changes, color gradients, and motion adjustments are driven by the rhythm of the audio signal at different frequencies. I also focused on voice interaction: a microphone captures sound in real time and maps it to the cube, exploring a more natural way of interacting that reflects the real-time nature of audio-visual interaction.
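Outside TouchDesigner, the band-splitting logic of this cycle can be illustrated with a small standalone Python example: compute the energy of low, mid, and high bands from one audio buffer, which is the same kind of quantity the Audio Spectrum and Analyze CHOPs provide in the network. The band boundaries are assumptions:

    import numpy as np

    def band_levels(buffer, sample_rate=44100):
        # Magnitude spectrum of one audio buffer.
        spectrum = np.abs(np.fft.rfft(buffer))
        freqs = np.fft.rfftfreq(len(buffer), d=1.0 / sample_rate)
        # Illustrative band boundaries in Hz.
        bands = {'low': (20, 250), 'mid': (250, 2000), 'high': (2000, 8000)}
        return {name: float(spectrum[(freqs >= lo) & (freqs < hi)].mean())
                for name, (lo, hi) in bands.items()}

    # A 100 Hz test tone should register mostly in the 'low' band.
    t = np.linspace(0, 1, 44100, endpoint=False)
    print(band_levels(np.sin(2 * np.pi * 100 * t)))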

Prototype Research and Development

In the final stage, I recreated the P5.js music 3D interaction in TouchDesigner, exploring different ways to refine the audio-visual experience. I tested six variations, from single and dual cube interactions to 2D nesting and multi-dimensional transformations, ensuring the cubes dynamically react to the tempo and different audio frequencies. Each approach was fine-tuned with color mapping, dynamic lighting, particle effects, and shape deformations, ultimately creating a system where low frequencies drive size changes and high frequencies control color shifts, making the interaction feel more immersive and responsive.
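The final mapping can be summarized in a compact sketch: low-band energy drives scale while high-band energy shifts the hue. The constants and ranges here are illustrative, not the exact network values:

    import colorsys

    def cube_state(low, high):
        scale = 1.0 + 2.0 * min(low, 1.0)          # bass expands the cube
        hue = (0.6 + 0.4 * min(high, 1.0)) % 1.0   # treble rotates the hue
        r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 1.0)
        return scale, (r, g, b)

    print(cube_state(low=0.5, high=0.2))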

