Posts

Showing posts from 2020

Introducing Halcyon

Halcyon is an immersive installation that lets users translate their brainwave activity into visual art with the help of a Muse headset. The project aims to entertain the curious mind and to explore a less-charted territory in digital media: brainwave art. Worn across the head, the Muse headset transfers brainwave data to an application called Mind Monitor, which then sends the data on to another piece of software, TouchDesigner, to project the visuals in real time. As a brand, Halcyon is very fluid and abstract, since it produces artwork that is different for everyone, and the references for this project need to reflect that as well. Furthermore, as Halcyon is a simple project, I think typography alone is sufficient to deliver the aim of the brand. A logo is the keystone of a brand, and it has the power to represent the values a brand upholds. Before proceeding to design a logo, there are some essential rules that...

TouchDesigner Experiment: Inserting OSC data with OSCIn

Continuing from one of my last experiments, I tried changing the data input: instead of audio, I used OSC data from my Muse headset. To connect an OSC device to TouchDesigner, make sure the IP address and port number match on both ends so the data transfers correctly. In this case, I used a third-party app called Mind Monitor (available on iOS) to connect my Muse headset to TouchDesigner. Below are screenshots and videos from my experiment. You can see that the brainwave data is recorded in real time in the software. I then used the alpha, beta, and theta brainwaves to drive the movement of the visuals (the chosen brainwave channels are just placeholders for now, to see the movement). The data is connected to 'Noise', which produces the fluid/abstract visuals you see in the background. I also set the colors to move/change over time.

TouchDesigner Experiment: OSC data by ZIGSIM (Update)

Trying to fix the problem I faced in my last post, this time I changed the input source to the ZIGSIM data shown in the video below. However, I think my data mapping (which source data corresponds to which parameter) is still wrong; I have to figure out which is which. Note: the ZIGSIM app on my phone is connected correctly (hence the two moving graphs in the upper part of the video). After changing the source data, the rectangle box moves, but not the way I want it to: it can't really go vertically upwards.

TouchDesigner Experiment: OSC data by ZIGSIM

Since the Muse headset can deliver brainwave data in the form of OSC, I wanted to try out how to input OSC data to TouchDesigner. To try it out, I searched for how-to videos and articles and found this article posted by the TouchDesigner group themselves. In the post, they mention that there are several ways to send an OSC data stream to TouchDesigner via an iOS application: TouchOSC or ZIGSIM. When I looked both up, TouchOSC is a paid app while ZIGSIM is free, so I decided to try ZIGSIM. Unfortunately, the article did not explain in detail how to use the ZIGSIM app with TouchDesigner. So from there, I searched for any ZIGSIM/TouchDesigner-related posts and found this. The post from ZIGSIM explains that we can send motion data from our device to a PC. There are basically two 'tutorials' in the article. The first one uses another app called ZIG Indicator, which can help visualize sensor values. However, ZIG Indicator can onl...

TouchDesigner Experiment: Spectrum to TOPs

This time, my experiment with TouchDesigner was creating an audio-reactive abstract wave visual by following Bileam Tschepe's tutorial on YouTube. In this video, Bileam Tschepe taught us how to convert an audio spectrum directly to TOPs using the CHOP to TOP operator and compositing. Following his tutorial, here is my version of the project. Song by Muskatt.

Designing the final project's visual in TouchDesigner

When creating the visuals for my final project, I was inspired by a 'Spectrum to TOPs' tutorial created by Bileam Tschepe on YouTube (I wrote a post about it here). The tutorial explicitly teaches how to convert an audio spectrum directly to TOPs using the CHOP to TOP operator and compositing. Since Bileam Tschepe used audio as the source data for the visuals, I swapped the first portion of the tutorial from audio data to OSC data (blog post here). OSC and audio data live in the CHOP family, while visuals are created in the TOP family, so a 'CHOP to TOP' node is required to bridge them. Initially its output is one pixel high and in black and white; to adjust it, change the data format if necessary (for the color), and add a Composite node to change the pixel size. A Noise node is added because it is responsible for the output resolution of the project; the Noise node is also where the OSC data is attached, in the transform section. This all...

TouchDesigner Experiment: Audio Reactive Particle Cloud

My second experiment with TouchDesigner was creating an audio-reactive particle visual by following Bileam Tschepe's tutorial on YouTube. Again, I followed his tutorial step by step. This tutorial is a little different because it uses both audio and visuals: the visual follows the music in real time. Besides audio, we are also introduced to the 'Math' node, used to add the audio channels together. This is the end product. Music is FriendsV2 by Muskatt.

TouchDesigner Experiment: Particle Displacement

My first experiment with TouchDesigner was creating a particle displacement visual by following Bileam Tschepe's tutorial on YouTube. His tutorial is pretty clear on how to create this project, but I'll show a little of how it goes. This is my result.

Trying TouchDesigner

TouchDesigner is a node-based visual programming language for real-time interactive multimedia content, developed by the Toronto-based company Derivative (via Wikipedia). TouchDesigner is flexible enough to create audiovisual projects, detailed 3D scenes, and complex interactive UI solutions, among other things. I honestly had never heard of this software before my lecturer introduced it. So far, I have been experimenting with nodes and looking up tutorials to get familiar with the details. The tutorials I've been following are mostly by Bileam Tschepe, who teaches the basics of using the software to create audio- and visual-related artwork. To get a project working, we must always remember that the network must flow from left to right: start everything on the left side of the screen and end the project on the right, preferably in a single output node. To start a project, it requires operations, and to create...

Installing node-osc in npm (node.js)

This post is an update on my Thinker Blinker experiment. Thus far, I had been trying to install the Node.js modules necessary for this project to work. The first thing to make sure of is that the file is in the right folder. In the following picture, we can see that the file I wanted to run is called 'index.js', located at /Users/christinedinata/index.js. So I typed 'node index' in my terminal/command prompt (read more in this post I uploaded previously) to run the file. However, there was an error saying the computer can't find the osc module. After more research on how to handle this problem, it turns out another module is needed: npm install node-osc (more information here). But when I tried to install it, there were a lot of issues and errors as well. This prompted me to search for why I couldn't install node-osc. This per...

Website Portfolio Update: Mobile Version

A mobile version of the web folio is also available (it works best on smartphones with a width of 414px, e.g. iPhone 6/7/8 Plus, etc.). According to StatCounter Global Stats, mobile phones hold a larger market share than desktops and tablets: the worldwide figures are 51.74% mobile users, 45.61% desktop users, and 2.65% tablet users. This shows that having a mobile version of a website is well worth doing, since people will most likely open websites on their smartphones. However, coding websites from scratch can be difficult: supporting both desktop and mobile takes a lot of resizing and checking on different devices just to see how everything looks. I regret to say that kweestin.com is not suitable for tablet devices such as the iPad, etc. This is due to the lack of time I had to balance all my school work during these several weeks. Please keep in mind that different devices may ...

Interview: Mulyadi Witono

Mr. Mulyadi (Didi) Witono is the creative director and film director of Milkyway Studio, based in Jakarta, Indonesia. Mr. Witono has a web folio showcasing the projects he did with Milkyway Studio. Stumbling upon his web folio, I found the email address he attached for business inquiries and wrote to him to request an interview. At first I didn't receive a direct reply, so I messaged his LinkedIn profile, which prompted him to reply to my initial email. The following is the interview exchange. 1. Can you explain what Milkyway Studio is and what you do in the company? Milkyway Studio is a full-service Indonesian-based film production studio dedicated to producing exceptional content for various client needs. We are more than just a production company - we provide additional services in order to ensure that each video we produce is smart, engaging, and achieves its purpose. My role in Milkyway Studio is as a Creative Director / Film Director for all Milkyway...

Using the Johnny-Five platform to control Arduino using JavaScript

While trying to figure out how to actually set the Thinker Blinker up, I noticed the creators specified that they "uploaded the Firmata sketch to an Arduino, then wrote JavaScript code to drive the bulb, using Johnny-Five to communicate with it." I then began to search for how to set Johnny-Five up. To understand more about what it is and how to use it, I stumbled across this article. The point I want to share is this: Julian Gautier implemented the Firmata protocol, a protocol used to access microcontrollers like Arduinos via software on a computer, in JavaScript in his Node.js Firmata library. Rick Waldron took it a massive step further: using the Firmata library as a building block, he created a whole JavaScript robotics and IoT programming framework called Johnny-Five. The Johnny-Five framework makes controlling everything from LEDs to various types of sensors relatively simple and pain-free. This is what many NodeBots now use to achieve some very impress...
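To make the Firmata layer less magical, here is a sketch of what a digital write looks like on the wire — the three bytes the Firmata library sends over serial when Johnny-Five code like `new Led(13); led.on()` runs. This is my own illustration of the protocol frame, not code from either library.

```javascript
// Firmata's DIGITAL_MESSAGE (0x90) sets all 8 pins of one "port" at once,
// with the pin states packed into two 7-bit data bytes (MIDI-style framing).
const DIGITAL_MESSAGE = 0x90;

// Build the 3-byte frame that sets one pin within its port.
// portState is the current bitmask of the other pins in the same port
// (0 here for simplicity — a real library tracks this).
function digitalWriteMessage(pin, value, portState = 0) {
  const port = pin >> 3;                      // 8 pins per Firmata port
  const bit = 1 << (pin & 0x07);
  const state = value ? (portState | bit) : (portState & ~bit);
  return Buffer.from([
    DIGITAL_MESSAGE | port,                   // command byte + port number
    state & 0x7f,                             // low 7 bits of the port state
    (state >> 7) & 0x7f,                      // remaining high bit
  ]);
}

// Setting the classic LED pin 13 HIGH:
const frame = digitalWriteMessage(13, 1);
console.log([...frame]);                      // → [ 145, 32, 0 ]
```

Johnny-Five sits two layers above this: the Firmata library writes frames like these to the serial port, and Johnny-Five wraps that in friendly objects (Board, Led, sensors) so you never touch the bytes yourself.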