
TouchDesigner Experiment: OSC data by ZIGSIM

Since the Muse headset can deliver brainwave data in the form of OSC, I wanted to find out how to feed OSC data into TouchDesigner. I searched for how-to videos and articles and found this article posted by the TouchDesigner team themselves.

In this post, they mention that there are a couple of ways to send an OSC data stream to TouchDesigner from an iOS application: TouchOSC and ZIGSIM. When I looked up both applications, TouchOSC turned out to be paid, while ZIGSIM is free, so I decided to try ZIGSIM. Unfortunately, the article did not explain in detail how to use the ZIGSIM app with TouchDesigner.

From there, I searched for posts relating ZIGSIM to TouchDesigner and found this one. The post, from ZIGSIM, explains how to send motion data from a mobile device to a PC.

There are basically two 'tutorials' in the article.

The first uses another app called ZIG Indicator, which helps visualize sensor values. However, ZIG Indicator only works with the JSON message format and cannot be switched to OSC.


The second tutorial shows how to connect the ZIGSIM app on a smartphone to TouchDesigner. After following the steps, my TouchDesigner could pick up the phone's data and show the movement in the displayed graphs. However, the software did not connect to the channels already set up in the project; instead, it created new ones, so the quaternion data was not read properly.
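As far as I understand it, an OSC input in TouchDesigner keeps one channel per incoming OSC address, so a message whose address does not match an existing channel simply spawns a new one. Here is a minimal Python sketch of that behaviour (the address patterns below, like the ZIGSIM one, are assumptions for illustration, not the actual names from my project):

```python
# Sketch (not TouchDesigner's actual internals): incoming OSC values
# are stored per address, and an unrecognized address creates a
# brand-new channel instead of feeding an existing one.

def route_osc(channels, address, values):
    """Store the latest OSC values under their address, creating the
    channel if it does not exist yet."""
    if address not in channels:
        channels[address] = []          # a new, unlabeled channel appears
    channels[address] = list(values)    # latest sample wins
    return channels

# The project expects this hand-labeled channel (hypothetical name)...
channels = {"/muse/quaternion": []}

# ...but the phone sends under a different address (assumed ZIGSIM
# pattern), so a second channel is created alongside the first.
route_osc(channels, "/ZIGSIM/abc123/quaternion", [0.0, 0.0, 0.0, 1.0])

print(sorted(channels))
```

This matches what I saw: the old channels stay empty while the data piles up under the new names.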

This is how it is supposed to look when the OSC data is successfully sent to TouchDesigner by ZIGSIM.

This is my result.

If you look closely at the two boxes on top (the ones showing the moving graphs), those lines were newly created when my ZIGSIM app connected to my computer, instead of replacing the already existing, labeled channels (the first three lines from the top). This is why the rectangular 3D cube (the moving shoebox) does not move in my experiment: the node driving it is reading from a different data source. So far, I have not figured out how to point it at the right source.
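One likely direction for a fix, once I find the right operator in TouchDesigner, is to rename the newly created channels to the names the downstream node expects. The idea in plain Python looks like this (every channel name here is a hypothetical placeholder):

```python
# Sketch of the idea behind remapping channel names: translate the
# channels that arrive under one name onto the names a downstream node
# expects. Unmapped channels pass through unchanged.

RENAME_MAP = {
    "/ZIGSIM/abc123/quaternion": "quat",   # assumed incoming address
}

def rename_channels(channels, rename_map):
    """Return a copy of `channels` with names translated through
    `rename_map`; unmapped channels keep their original name."""
    return {rename_map.get(name, name): vals
            for name, vals in channels.items()}

incoming = {
    "/ZIGSIM/abc123/quaternion": [0.0, 0.0, 0.0, 1.0],
    "/ZIGSIM/abc123/accel": [0.1, 0.2, 9.8],
}
fixed = rename_channels(incoming, RENAME_MAP)

print(sorted(fixed))
```

If something like this works, the cube's node would read the quaternion values under the name it was originally wired to.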
