October 26th,

How To Co-exist With Technology



To foster collaboration across domains, Dentsu Lab Tokyo occasionally hosts talk sessions where artists, technologists, data scientists, scholars, and writers come together.
Named after Cai Guo-Qiang's "Caretta Fountain," a fountain in the shape of a turtle, the series is called "Kamekai" (which literally translates as "Turtle Gathering"). It aims to bring together the participants' varied thoughts and viewpoints and to create an interactive session rather than a mere lecture.

For our very first guest talk, we invited Max Weisel and his colleagues from Google to Dentsu Lab Tokyo. Together with Bjork, Max developed Biophilia, one of the first app albums in the world, and his work has been exhibited at MoMA in New York.

Max Weisel / マックス・ワイゼル

About Max Weisel

Max Weisel's experience ranges from collaborating with Bjork to exhibiting at MoMA in New York. He is the founder of the San Francisco-based R&D company RelativeWave (acquired by Google in 2014) and continues to develop the prototyping environment "Form" at Google.

I'm interested in interfaces, and I like to think about how humans and computers coexist. Today I'm going to talk about my own work and a few things that inspire me. I started making things around 2005; as a programmer, I began by building my own and my friends' websites in Flash.

The first app I made for the iPhone let you download videos from YouTube. That was before the idea of jailbreaking was born, even before the App Store came into existence.

Next, as a hobby, I started making tools for playing with music. Then, right around the time I graduated from high school, I got a comment on my blog.

Sudden Invitation from Bjork

The comment said, "Bjork wants to make an iPad app; can I get in contact with Max Weisel?"

At first I thought it was a joke, but it turned out to be true, so I decided to join the project. The app became one of the first apps made for a musical album.

The app is available here.

I will show you three of the instruments in the app. "Moon" is made to represent a melancholic feeling. While making it, I thought about tides and fluidity. When you touch the moon, a note appears, and you can change its pitch by moving up and down. I made this for the album, but I also wanted users to be able to compose music of their own with it.
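As a rough illustration of the kind of mapping "Moon" describes, a touch's vertical position can be quantized to a note in a scale. Everything below, the scale, the screen size, and the function names, is a hypothetical sketch for illustration, not the app's actual code.

```python
# Hypothetical sketch, not the actual Biophilia code: map a touch's
# vertical position to a pitch, quantized to a musical scale.

PENTATONIC = [0, 2, 4, 7, 9]  # semitone offsets of a major pentatonic scale

def touch_to_midi_note(y, screen_height=800, base_note=60):
    """Map a vertical touch position (y = 0 at the top) to a MIDI note.
    Higher on screen means higher pitch, spanning two octaves of the scale."""
    fraction = 1.0 - y / screen_height             # 1.0 at the top of the screen
    steps = int(fraction * 2 * len(PENTATONIC))    # which scale step the touch lands on
    octave, degree = divmod(steps, len(PENTATONIC))
    return base_note + 12 * octave + PENTATONIC[degree]

print(touch_to_midi_note(800))  # bottom of the screen -> 60 (middle C)
print(touch_to_midi_note(400))  # halfway up -> one octave higher, 72
```

Quantizing to a scale like this is a common way to keep a casual user's touches sounding musical rather than arbitrary.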

"Solstice," simply put, is a galaxy-sized harp. The user can pull strings out from the sun at the center, and when a star hits a string, a note sounds. It was originally made for a Christmas song, so the user can also arrange the strings like a Christmas tree.

The next instrument is something I made for a piece of music called "Numa." The theme of the song is how something you can't see can make a huge impact, so I set out to make something that represents that. The music has no key, so I made something that visualizes the relativity of the sounds. What I'm particularly proud of is that this became the first app to be exhibited at MoMA.

An Interface to Control an Unprecedented Instrument

Although I was initially asked only to make an app, I ended up working with Bjork on everything, and I joined her tour as well. She likes to build a lot of instruments, but many of them are hard for human beings to play, and I helped develop some of those instruments too.

Eventually, I made an app to control those instruments, and I even played them during the live shows! I wanted to make the interface as simple as possible, so that I could build on top of it as further needs arose.


For me, the work I did for Bjork was an opportunity to think about what it means to create an interface for music. After the tour, I realized that a guitar, for instance, is shaped to produce the sound of a guitar, but that shape is not necessarily the best interface for playing it. Creating something digital makes it possible to transcend that physical restriction, so I decided to pursue that in my next project.


What I like about this project is that, although the sound itself is electronic, the composition you actually hear sounds very human. These devices are nice to play, but if you look at drum pads, for instance, there is not much variety in their interfaces, and I felt they were not best suited for composing music. So for my next attempt, I did a project where I could make the interface itself.

Whiteboard Music

Recent Interesting Projects

From here, I'd like to talk about some projects I find interesting.


This is a project by a friend of mine; an animator from Disney is actually drawing the character. VR is very difficult to evaluate, but this is one of my favorite VR projects.


In this project, a man and a woman each put on an HMD and exchange perspectives. To make the experience immersive, the actions performed on each side must be the same. What makes this VR project interesting is that each participant gets to experience something they could never really experience themselves.

I would like to close my presentation with the following quote.

We need to become better at being humans. Learning to use symbols and knowledge in new ways, across groups, across cultures, is a powerful, valuable, and very human goal. And it is also one that is obtainable, if we only begin to open our minds to full, complete use of computers to augment our most human capabilities.

Doug Engelbart

Thank you very much.

Q&A for Max Weisel

Audience: Can you explain a little further how Whiteboard Music is structured?
A dot represents a note, and a line represents a slider. The line does not need to be straight; it can be a curve, and tracing along the curve changes the value exponentially.
Audience: Within the same line, one part controls the pitch and another functions as a slider. How are they detected?
There is software that maps the drawn lines and connects them to MIDI; the video omits the part where that MIDI mapping is done.
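As a rough sketch of how such a mapping layer might work, a position traced along a drawn curve can be converted to a MIDI control value by its fraction of the curve's total length. The function names and the arc-length approach are assumptions for illustration, not the actual Whiteboard Music software.

```python
# Hypothetical sketch: map a finger position traced along a drawn curve
# to a MIDI control-change value (0-127) by arc-length fraction.
import math

def arc_lengths(points):
    """Cumulative distance along a polyline of (x, y) points."""
    lengths = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        lengths.append(lengths[-1] + math.hypot(x1 - x0, y1 - y0))
    return lengths

def curve_to_midi_cc(points, touch_index):
    """Convert a position on the curve to a MIDI CC value (0-127)."""
    lengths = arc_lengths(points)
    total = lengths[-1] or 1.0          # avoid dividing by zero on a single point
    fraction = lengths[touch_index] / total
    return round(fraction * 127)

# A straight diagonal line sampled at 5 points:
line = [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
print(curve_to_midi_cc(line, 0))  # start of the line -> 0
print(curve_to_midi_cc(line, 4))  # end of the line -> 127
```

Because the mapping uses arc length rather than raw coordinates, the line can loop or curve freely and the slider still behaves consistently.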
Audience: What do you think of interfaces within VR, for playing instruments, for example?
I don't have anything particular in mind, but I'm interested in how feedback is delivered in VR, and in what will happen when VR transcends the current computer screen. When something like that becomes the norm, kids in their teens may not have detailed knowledge of the instruments themselves, but they will be able to freely share what kind of music they like.
Audience: Was Biophilia the first project where you made an app capable of creating music?
I had made instrument apps before, but never at this magnitude.
Audience: How did you end up getting the job from Bjork?
Apparently, when Bjork first thought about making an app, the director told her to contact the creator of her favorite app, and it turned out her favorite app was one I had made. That's how she got in contact with me.
Audience: The visuals of the app matched the music very well. Did you collaborate with someone?
There is a designer who has long done the artwork for Bjork; the two of us worked together and discussed what kind of visuals would suit the app.
Audience: Is there a framework you used to make this, or did you make everything from scratch?
I did occasionally use openFrameworks and Processing, but mostly I made things from scratch. When the collaboration with Bjork started, I told her I was capable of making things that I actually wasn't capable of yet, so I had to push myself.
Audience: While making the Pendulapp, how many iPhone screens did you break?
Actually, I was testing it over my bed, and luckily I didn't break any screens! To my surprise, the earphone jack was rather stiff, which kept the phone from swinging freely.
Audience: For the sound programming in many of these projects, are you using a framework, or is it made from scratch?
It differs by project, but the sound for the Bjork project was mostly built from existing samples.
Audience: As you mentioned, most acoustic instruments are shaped to produce their sound physically, not for the sake of the interface. Do you know of an example of a digital instrument, designed interface-first, being transformed back into an acoustic instrument?
I'm not really sure whether such an example exists, but I personally think that what works perfectly as a digital instrument won't necessarily become a perfect acoustic instrument. Just like drawing in VR, I think some things are impossible in an analog environment.

About Chris Conover

Chris Conover studied sound engineering at the University of Michigan. Since graduating, he has been working at Google as a UX engineer, developing a next-generation prototyping tool.

How can you experience music if you can’t hear?

Thumping Threads Vest

Originally, I was into music and had done a lot of music projects, and I was always curious what it would be like to experience music if you can't hear. Usually I work with my headphones on, and when someone talks to me, I hang the headphones on my knee. I realized that I could feel the vibration directly, and thought of using this for people who cannot hear.

So I made a garment that lets deaf people experience music. The garment has several vibration motors running along the spine: higher-pitched sounds vibrate the motors toward the top of the garment, lower-pitched sounds vibrate the motors toward the bottom, and the loudness of the sound is taken into account as well. A battery sits inside the pocket, and the vibration can be adjusted to suit the person wearing the garment. I also tried to make software that enables people who can't hear to compose music.
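The pitch-to-position mapping described above can be sketched roughly as follows: split the audio spectrum into bands, one per motor along the spine, low bands at the bottom and high bands at the top. The motor count, frequency range, and log spacing are assumptions for illustration, not the actual garment's specification.

```python
# Hypothetical sketch of the vest's pitch-to-position mapping.
# Motor 0 sits at the bottom of the spine (low pitch); the last motor
# sits at the top (high pitch).

def band_edges(n_motors, f_min=60.0, f_max=8000.0):
    """Logarithmically spaced band edges, matching how pitch is perceived."""
    ratio = (f_max / f_min) ** (1.0 / n_motors)
    return [f_min * ratio ** i for i in range(n_motors + 1)]

def motor_intensities(spectrum, n_motors=6):
    """spectrum: list of (frequency_hz, magnitude) pairs.
    Returns one 0..1 vibration intensity per motor."""
    edges = band_edges(n_motors)
    energy = [0.0] * n_motors
    for freq, mag in spectrum:
        for m in range(n_motors):
            if edges[m] <= freq < edges[m + 1]:
                energy[m] += mag
                break
    peak = max(energy) or 1.0
    return [e / peak for e in energy]  # loudest band drives its motor fully

# A pure low tone: only the bottom motor should buzz.
low_note = [(100.0, 1.0)]
print(motor_intensities(low_note))
```

Normalizing against the loudest band also captures the talk's point about magnitude: a louder sound drives its motor harder relative to the others.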

Lastly, this is more of an observation on my part, but I'm interested in how sound affects design itself. If you look at Facebook's Messenger app, it's very interesting to see how sound design is implemented there. I hope I can do something similar.

Q&A for Chris Conover

Audience: About the garment project, what was the actual feedback like? I once did a project where I tried to send sound to a friend who cannot hear, and my friend was surprised by how rich sound is in terms of information. Did something like that happen in your project?
I had four users try the garment, and all of them responded positively. Someone asked why I didn't add sensors to attach to the hands, but I wanted the wearer to be able to go to a club like everyone else, so I decided to hide the vibration motors. What particularly struck me was that one person said he wanted to invest in the project. Entering the medical field is really difficult because of restrictions and regulations, but his idea was to make it a product for people who come home tired from work and use it to experience music at home. So it isn't only for the hearing impaired; it can be used by hearing people as well.
Audience: With the music composition software, what kind of music can be made? Something complex, or something rather simple?
The software I made only uses basic samples and can create drum sequences, but if you configure it with other instruments, it should be capable of something more complex.
Audience: I'm really interested in what music would sound like to people who have never "heard" before, and I really hope you continue this project. I'd like to see what they perceive as beauty.
Thank you very much.

About Parteek Saran

Parteek Saran works at Google as an interaction designer. He joined Google when RelativeWave was acquired, and he is currently working on a design prototyping tool with the other members.


Today, I want to talk about two projects: the first one is ARTPOP. It is a project done for an album by Lady Gaga.

The interface is very simple: you can use the turntable to go forward and backward, choose your favorite part, and listen to the album. For ARTPOP, I tried to maintain an interaction that functions like an ordinary music player.


Next is ARTHAUS. ARTHAUS is a 3D modeling tool that lets you make your own 3D sculpture inspired by the music and share it on social media. Various effects, such as changing colors, altering shapes, and changing the background, let you design a sculpture in a very simple way.


I also want to share AURA TIMELINE. This app lets you see how you interacted with the app and what music you listened to where, and you can also remix based on a previously made timeline.
While making this app, I spent a lot of time on the wireframes. For instance, when representing a shadow, many people use the shadow tool in Photoshop, but I used CINEMA 4D and meticulously studied how it would affect the visuals.

This "Globe Screen" never got fully implemented, but it was meant to be a tool for visualizing how other ARTPOP users were using the app. Different colors represent the different activities the users are engaged in. For the globe, instead of using a 2D image, I layered multiple 3D models.

Prototyping Environment “FORM”

The second project I want to talk about is a prototyping environment called Form.
I think the tool you use to make something matters, because the tool itself defines the output. Instead of a static development environment, I wanted to make a prototyping environment that is more dynamic, and that is why I developed Form.

Basically, as you can see, you define how the UI will work on the desktop by connecting patches, and then you can instantly check the result on a mobile device.

For instance, when making a simple weather forecast app, the weather data comes from the web, but in Form you can precisely control how that data is presented in the app. You can download Form from the App Store, so please give it a try.

Q&A for Parteek Saran

Audience: Is there something "unique" about Form? Is there something distinctive about the output from Form?
Form is an environment designed to let users test things out instantly; it is made for testing ideas. As for the output, Form gives detailed control over animation, such as how springs behave and which presets are used. If there is something distinctive about Form's output, that might be it.
Audience: How did you end up working with Lady Gaga?
Originally, Max's project caught Gaga's attention, and I got involved as well.
Audience: Regarding the feature where users can share 3D objects, is it related to the concept of the album itself, or is it a stand-alone concept just for the app?
Gaga felt that, instead of music simply flowing from the artist to the listener, she wanted to explore how listeners' feedback might positively affect the artist. In that sense, by letting listeners share what they had made, the feature reflected the artist's thinking about creative activity.
Audience: How do you work with programmers? How do you maintain a certain quality and keep a project moving?
For complex projects, I tend to use prototypes to discuss things. It differs by project; on some, I use static materials as well. I think it's important to communicate effectively and to know which kind of prototype conveys the idea best, given the nature of the project.

About Laurel Wagstaff

Laurel Wagstaff works at Google as a design producer. She joined Google upon its acquisition of RelativeWave and continues her work there.

Image Recognition on Images Uploaded to Flickr

Since my partner, Dr. Simon Osindero, works on AI at Flickr, I will be talking about that today. He is in a division that uses image recognition techniques to understand what kinds of images are uploaded to Flickr.

For instance, say there is a photo of a dog jumping, with a fence behind the dog. With deep learning techniques, this image is correctly tagged with keywords such as "dog," "garden," and "fence." But compared to an image like this, if you take one with a human figure in it, you will find more interesting results.

Again, for instance, say there is an image of me together with Simon. Simon has a very distinctive hairstyle, and the system recognizes this as well. The keywords produced by deep learning are "person," "face of a male," "beard," and "eccentric."

For me, the keywords are "female" and "model." Simon often compliments me by saying, "Even the computer thinks you look like a model!"

Can Machines Recognize Beauty? Or Is It Something Only Humans Can Do?

I think computers can recognize beauty to some extent, but I don't know what things will look like 10 years from now.

Beyond merely recognizing what is beautiful and what is not, I also want to talk about whether it is possible to learn a particular artistic style.

As you can see here, although not perfect, it has become possible to reproduce a style like Van Gogh's. I hope that taking on various artistic styles will become a new way to enjoy photographs.

For instance, my mother is an artist, and I think it would be amazing if it were possible to "learn" my mother's artistic style and apply it to my family photos.

Q&A for Laurel Wagstaff

Audience: Say there are keywords such as "cute" or "not cute." On what criteria does the system recognize that? How does it learn what is cute and what is not?
This is done by studying activity on Flickr. The system analyzes how many times a photo has earned a favorite, and it also reads the comments. Currently, the more comments a photo gets, the more the system treats it as a good photograph, and it therefore ranks higher as a beautiful photograph.
Audience: I think the examples you provided are all digital. Do you think there would be a difference if you applied the same method to analog pictures? Do you plan to try?
I think that's an interesting question. I would like to try it.
Audience: Regarding the earlier comment on how the system analyzes an image, how do you make the system understand what is beautiful, specifically? Is there crucial data the system looks at?
Right now, the system does not analyze the composition of the picture itself; it analyzes data around the photograph, such as pageviews and the number of favorites. So the ranking is based on numerical figures, observed from how actual human beings interact with the image.
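A minimal sketch of that kind of engagement-based ranking: score a photo from the interaction data around it rather than from its pixels. The weights, the log scaling, and the choice of signals are assumptions for illustration, not Flickr's actual model.

```python
# Hypothetical sketch: rank photos by engagement signals (views,
# favorites, comments) instead of analyzing the image itself.
import math

def engagement_score(views, favorites, comments):
    """Combine interaction counts into a single rank score.
    Log-scaling keeps a handful of viral photos from dominating."""
    return (1.0 * math.log1p(views)
            + 3.0 * math.log1p(favorites)   # a favorite signals more intent than a view
            + 5.0 * math.log1p(comments))   # a comment signals the most engagement

photos = {
    "dog_jumping.jpg":   (1200, 40, 3),   # (views, favorites, comments)
    "sunset_blurry.jpg": (300, 2, 0),
}
ranked = sorted(photos, key=lambda p: engagement_score(*photos[p]), reverse=True)
print(ranked[0])  # the photo with the most engagement ranks first
```

Weighting comments above favorites, and favorites above views, mirrors the answer above: the stronger the human action, the stronger the signal that the photograph is good.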
Audience: I suppose there are many companies (e.g., Google Photos) and other entities working on the same research topic. Is there something unique about what Flickr is doing? And what do you think of the incident Google Photos ran into?
Flickr wasn't especially concerned with what happened with Google Photos, but I thought it could be interesting to make use of the "interesting mistakes" such a system makes.

This was our first time hosting an event like this, but many visitors came, and there was good interaction between the speakers and the audience. We will be hosting similar events in the future, so please come back again!



Dentsu Lab Tokyo