
Posts

Introducing Halcyon

Halcyon is an immersive installation that lets users translate their brainwave activity into visual art with the help of the Muse headset. The project aims to entertain the curious mind while exploring a less-charted territory in digital media: brainwave art. Worn across the head, the Muse headset transmits brainwave data to an application called Mind Monitor, which then sends the data on to TouchDesigner, where the visuals are projected in real time. Halcyon, as a brand, is very fluid and abstract, since it produces artwork that is different for everyone. To reflect this goal, the visual references for the project need to be equally fluid. Furthermore, since Halcyon is a simple project, I think typography alone is sufficient to deliver the aim of the brand. A logo is the keystone of a brand, and it has the power to represent the values a brand upholds. Before proceeding to design a logo, there are some essential rules that...
Recent posts

TouchDesigner Experiment: Inserting OSC data with OSCIn

Following up on one of my last experiments, I tried changing the data input: instead of audio, I used OSC data from my Muse headset. To connect your OSC device to TouchDesigner, make sure the IP address and port number match on both ends so the data transfers correctly. In this case, I used a third-party app called Mind Monitor (available on iOS) to connect my Muse headset to TouchDesigner. Below are screenshots and videos from my experiment. You can see that the brainwave data is recorded in real time in the software. I then used the alpha, beta, and theta brainwaves to drive the movement of the visuals (the chosen brainwave channels are just placeholders for now, to see the movement). The data is connected to a Noise node, which produces the fluid, abstract visuals you see in the background. I also set the colors to change over time.
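Under the hood, Mind Monitor streams each reading as a small UDP packet in the OSC format: a null-padded address string, a type-tag string such as ",f", and big-endian float arguments. As a rough sketch of what TouchDesigner's OSC In CHOP does with each incoming packet (the address name below is an assumption based on Mind Monitor's documented defaults, not taken from my project file):

```python
import struct

def _pad(n):
    # OSC strings are null-terminated and padded to a 4-byte boundary
    return (n + 4) & ~3

def parse_osc_floats(packet: bytes):
    """Parse a simple OSC message whose arguments are all floats.
    Returns (address, [float, ...])."""
    end = packet.index(b"\x00")
    address = packet[:end].decode("ascii")
    i = _pad(end)
    tag_end = packet.index(b"\x00", i)
    typetags = packet[i:tag_end].decode("ascii")  # e.g. ",f" or ",fff"
    i = _pad(tag_end)
    values = []
    for tag in typetags[1:]:
        if tag == "f":
            values.append(struct.unpack(">f", packet[i:i + 4])[0])
            i += 4
    return address, values

if __name__ == "__main__":
    # Hypothetical alpha-band reading, similar to Mind Monitor's
    # /muse/elements/alpha_absolute messages
    demo = b"/muse/alpha\x00" + b",f\x00\x00" + struct.pack(">f", 0.5)
    print(parse_osc_floats(demo))
```

This is only a sketch of the wire format; in practice the OSC In CHOP handles all of this for you once the IP address and port match.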

TouchDesigner Experiment: OSC data by ZIGSIM (Update)

To fix the problem I faced in my last post, this time I tried changing the input source to the ZIGSIM data shown in the video below. However, I think the data mapping (which source data is supposed to drive which parameter) is still wrong on my side; I have to figure out which is which. Note: the ZIGSIM app on my phone is connected correctly (hence the two moving graphs in the upper part of the video). After changing the source data, the rectangle box moves, but not the way I want it to: it can't really go vertically upwards.

TouchDesigner Experiment: OSC data by ZIGSIM

Since the Muse headset can deliver brainwave data in the form of OSC, I wanted to try inputting OSC data into TouchDesigner. I searched for how-to videos and articles and found this article posted by the TouchDesigner team themselves. In it, they mention that there are several ways to send an OSC data stream to TouchDesigner from an iOS application: TouchOSC and ZIGSIM. When I looked up both applications, TouchOSC was paid while ZIGSIM was free, so I decided to try ZIGSIM. Unfortunately, the article did not explain in detail how to use the ZIGSIM app with TouchDesigner. From there, I searched for ZIGSIM-TouchDesigner related posts and found this. The post from ZIGSIM explains that we can send motion data from our device to a PC. The article shows two 'tutorials'. The first uses another app called ZIG Indicator, which can help visualize sensor values. However, ZIG Indicator can onl...

TouchDesigner Experiment: Spectrum to TOPs

This time, my TouchDesigner experiment was creating an audio-reactive abstract wave visual by following Bileam Tschepe's tutorial on YouTube. In this video, Bileam Tschepe taught us how to convert an audio spectrum directly to TOPs using the CHOP to TOP node and compositing. Following his tutorial, here is my version of the project. Song by Muskatt.

Designing the final project's visual in TouchDesigner

When creating the visuals for my final project, I was inspired by a 'Spectrum to TOPs' tutorial created by Bileam Tschepe on YouTube (I wrote a post about it here). The tutorial explicitly teaches how to convert an audio spectrum directly to TOPs using the CHOP to TOP node and compositing. Since Bileam Tschepe used audio as the source data for the visuals, I changed the first portion of the tutorial from audio data to OSC data (blog post here). Because OSC and audio data live in the CHOP family, while visuals are created in the TOP family, a 'CHOP to TOP' node is required. Initially, the resulting texture is only one pixel high and black and white; to adjust it, change the data format if necessary (for the color), and add a Composite node to change the pixel size. A Noise node is added because it is responsible for the output resolution of the project. The Noise node is also where the OSC data is attached, in the transform section. This all...
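Before brainwave values can drive the Noise node's transform parameters, they usually need to be rescaled into a useful range and smoothed, since raw EEG band power is noisy and would make the visuals jitter. A minimal sketch of that mapping step outside TouchDesigner (the input/output ranges and the smoothing factor here are illustrative assumptions, not values from the project):

```python
def remap(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap value from [in_lo, in_hi] to [out_lo, out_hi], clamped,
    like TouchDesigner's range parameters on a CHOP."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

class Smoother:
    """One-pole low-pass filter so noisy band-power data
    doesn't make the transform jitter frame to frame."""
    def __init__(self, alpha=0.1):
        self.alpha = alpha   # 0..1; smaller = smoother, slower response
        self.state = None

    def step(self, x):
        if self.state is None:
            self.state = x
        else:
            self.state += self.alpha * (x - self.state)
        return self.state

# Hypothetical per-frame use: turn an alpha-band reading into a translate value
smooth_alpha = Smoother(alpha=0.2)
translate_x = remap(smooth_alpha.step(0.6), 0.0, 1.0, -2.0, 2.0)
```

In the actual network, a Filter or Lag CHOP plus the export ranges play roughly this role; the sketch just makes the arithmetic explicit.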

TouchDesigner Experiment: Audio Reactive Particle Cloud

My second experiment with TouchDesigner was creating an audio-reactive particle visual by following Bileam Tschepe's tutorial on YouTube. Again, I just followed his tutorial step by step. This tutorial is a little different because it uses both audio and visuals: the visuals follow the music in real time. Besides the audio itself, we are also introduced to the 'Math' CHOP, which adds the channels of the audio together. This is the end product. Music is FriendsV2 by Muskatt.
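Conceptually, what the Math CHOP does here is sum the channels sample by sample (if I understand the node correctly, this corresponds to its combine-CHOPs "Add" mode). A minimal stand-in outside TouchDesigner:

```python
def add_channels(channels):
    """Sum several CHOP-style channels sample by sample,
    like a Math CHOP combining its inputs with Add."""
    return [sum(samples) for samples in zip(*channels)]

# e.g. combining a left and right audio channel into one control signal
combined = add_channels([[1, 2, 3], [10, 20, 30]])  # [11, 22, 33]
```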