May 24, 2013

Puttin' the Robot Band Together


So,
A robot band.
What else would an artist want to make, really? I tried to list my top 10 things I want to make as an artist, and robot band was #3 on the list, just below “sexy alien chick” and any kind of Star Trek fan art. So essentially number 1 (since those other two are a given).

The fact is, I might be a bit out of my league here. I do not really think of myself as a character artist, nor have I done any real work for real-time rendering. So why would they want me as the sole artist on this game?
No, really. I'm asking.
Just kidding.
The truth is I have some real dirt on the NIH and IS3D, and blackmail can go a long way. Now, on to some concepts...

Before we get into the robots though, just let me say I have not textured any of these robots yet, so the renders I have are all ambient occlusion renders out of Maya, except for the synth robot, who I slapped a quick texture on so you could see the piano keys (since they are not built into the geometry).

The first robot I made was a ghetto robot. This robot will be the piano player. From a story perspective, he (yeah, the robot is a “he.” Does that make me sexist? I don't think so) was the first robot built for the band, because... well... he was the first robot I built for the band. I imagined that when this robot band first got going, the budget wasn't there. I therefore made him out of other household items: a lacrosse stick for an arm (well, it's in some households), a file cabinet for a body, and lamps for “eyes.” The feet are a bit less realistic. I made the feet like on a wind-up toy, mostly because my coworker suggested it, and I thought it was hilarious. How's he gonna play piano with one hand? I'm an artist, don't bother me with pesky things like logic and realism.



The 2nd robot I made was awesome. I'm not gonna say anything else about it, nor am I going to post any picture of it, because that's all you need to know about it. Trust me. It's awesome.

The 3rd robot was the synth piano player. I made him an actual synth piano. I love the idea of the robot actually playing himself. Talk about made to be a musician! I essentially made a Korg MS-20 and added an arm. I also put a lamp on top, for much the same reason as the lamps on the first robot: to simulate a face. For some reason, this is my favorite robot so far.



The 4th robot I call Cyl, because he was made from a cylinder. (Yeah, not very creative, I know.) Cyl is a combination of a few Star Wars characters, the first being R2D2, perhaps the coolest robot of all time. I also made a hat-like head inspired a bit by the hat Bane (from the Clone Wars series) wears. It is not too much like Bane's hat, but Bane was just somewhat close to the look I wanted for Cyl. Cyl is the bass player and is a bit short. He's got tiny little legs and really long arms, perfect for slappin' out the phat bass licks.



The 5th robot is the guitarist. This robot is an android, more so than the others. As Data from Star Trek TNG will tell you (yeah, I'm geeky enough to use the three-letter acronym. If you don't know what “TNG” stands for, I'm not sure I'm actually allowed to tell you. Just know that you'd probably feel out of place at Comic-Con), an android is a robot that looks like a human. While the other robots have anthropomorphized parts, this robot is the most human-like. I did this mostly because there are many guitar-type poses that necessitate a human form: sliding on the knees while jamming on the guitar, for example (done very well by Michael J. Fox in “Back to the Future.” “Marty, that was very interesting music.”), and pretty much anything done by Angus Young. Plus, seeing as at present there is no plan for implementing vocals, the guitarist will be the “frontman” of the band, and I thought it might be better for animations in general if he looked and moved like a human. And for all those G 'n R fans out there, I made him have a “bucket head.” Get it?



In the next art post I will discuss how great I am, and other important aspects of the art for Nurbits, like the musical instruments and backgrounds.

May 17, 2013

Prototype 3: Welcome to the Studio


Hello everyone. Today's post will take you through the different layers of gameplay in our 3rd prototype. This prototype was built as a precursor to starting work on our final product, so some of what you see here should influence how our game looks in the end. This prototype has helped us understand much of the basic functionality that needs to be implemented, and gave us a chance to try different things until we could figure out what worked or played the best.


This prototype begins to bring in more of the story elements surrounding the puzzles that Stephen showed in our last post. From that last prototype we were able to build up a more complete experience and a system where short musical compositions could be created. The prototype and story begin within a studio environment, with a band of robots. Each robot has its own instrument, and for them to be able to play, the user must venture into their brains to complete puzzles that build up the audio composition. This studio area has little gameplay, other than controlling a few bits of the entire song, but it will bring a lot of personality and customization to our game.

From the studio, a user must enter the brain of a robot. The brain scene is where the bulk of composing the song happens. Here users can sequence together the loops that they create, along with connecting effects to any loops they wish. These connections are like hooking the robot's senses to its motor control systems in order to play the notes that have been pieced together.

The brain contains audio nodes and effect nodes that represent different regions of the brain and how they function together. The audio nodes house the puzzle aspect of our game. Completing these puzzles opens up loops that can then be used in the brains of any robot. We plan to have many pre-made loops/puzzles of varying difficulty, each using different concepts from neuroscience. These pre-made loops will guide the user toward creating a good-sounding composition, and eventually toward creating their own loops to be played and unlocked for further creativity in song making.
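To make the node idea a little more concrete, here's a rough C# sketch of how a brain's node graph might be modeled. The class and member names are just for illustration; this isn't our actual implementation.

```csharp
// Rough sketch of the brain's node graph -- illustrative names only,
// not our actual implementation.
using System.Collections.Generic;

public abstract class BrainNode
{
    public readonly List<BrainNode> Outputs = new List<BrainNode>();

    // Hooking a node up is like wiring a sense to a motor system:
    // whatever this node produces flows on to its outputs.
    public void ConnectTo(BrainNode target)
    {
        if (!Outputs.Contains(target))
            Outputs.Add(target);
    }
}

// An audio node houses a puzzle; solving it unlocks the loop, which
// can then be used in the brain of any robot.
public class AudioNode : BrainNode
{
    public string LoopName;
    public bool PuzzleSolved;
}

// An effect node processes whatever loops are routed through it.
public class EffectNode : BrainNode
{
    public string EffectName;  // e.g. "delay" or "reverb"
}
```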
To take you through each of these bits of our game, I have created a video walkthrough.


As always we have a demo here, if you want to have a go at playing a song with your own robot band. They may only look like cubes now, but just wait until the art gets put in during later iterations. So, look forward to that.

May 10, 2013

Prototype 3: grid puzzle progression


For our third prototype we were trying to think of a better way to integrate music with our puzzle system. We decided to try a grid-based step sequencer, which would work pretty well with our new UnitySynth-based dynamic music system.

Chips could be placed on the grid, and chips of different widths could be used to represent notes of different lengths: eighth notes, quarter notes, half notes, and whole notes. Like our previous prototypes, chips would have a threshold to reach before sending their signals on to the next chips in the neural network, and there would be an initial stimulus chip that would start that signal. We also created a relay chip to extend connections between note chips, much like interneurons in the brain. There are also excitatory and inhibitory chips that bring other chips toward or away from threshold.
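For the technically curious, here's a bare-bones C# sketch of that chip signaling model. The names and numbers are made up for illustration; they aren't pulled from the prototype's code.

```csharp
// Bare-bones sketch of the chip signaling model -- illustrative only.
using System.Collections.Generic;

public class Chip
{
    public float Threshold = 1f;
    // Positive weight = excitatory (pushes targets toward threshold),
    // negative = inhibitory (pushes them away from it).
    public float Weight = 1f;
    public List<Chip> Connections = new List<Chip>();

    float potential;

    // Receive a signal from an upstream chip. Once we reach threshold,
    // fire on to our own connections -- like a neuron reaching its
    // action potential -- then reset.
    public void Receive(float input)
    {
        potential += input;
        if (potential >= Threshold)
        {
            potential = 0f;
            Fire();
        }
    }

    public void Fire()
    {
        foreach (Chip next in Connections)
            next.Receive(Weight);
    }
}

// The stimulus chip has no inputs of its own; the sequencer clock calls
// Fire() on it to start the signal moving through the network. A relay
// chip is just a Chip whose threshold is low enough to always pass the
// signal along, much like an interneuron.
```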




In this prototype we decided chips would connect to each other based on proximity, unlike our previous prototypes, where chips had to be deliberately connected. After implementing these ideas we designed a series of puzzles that attempt to introduce these concepts to the player gradually.
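In Unity, that kind of proximity-based connection can be as simple as an overlap query. Here's a small sketch; the radius and class name are invented for the example, and it assumes each chip has a collider.

```csharp
// Sketch of proximity-based connection in Unity -- the radius and
// class name are invented for this example.
using System.Collections.Generic;
using UnityEngine;

public class ChipProximity : MonoBehaviour
{
    public float connectRadius = 1.5f;

    // Any chip within connectRadius counts as connected -- no manual
    // wiring step like in the earlier prototypes. Assumes each chip
    // GameObject has a collider attached.
    public List<ChipProximity> FindNeighbors()
    {
        var neighbors = new List<ChipProximity>();
        foreach (Collider hit in Physics.OverlapSphere(transform.position, connectRadius))
        {
            var chip = hit.GetComponent<ChipProximity>();
            if (chip != null && chip != this)
                neighbors.Add(chip);
        }
        return neighbors;
    }
}
```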



You can play this prototype in your web browser now!

This prototype formed the foundation of what we are currently planning for the core gameplay mechanics of the final version of Nurbits. It's undergoing some evolutionary improvements as we iterate, but the core idea is present, and it already felt interesting to us in this prototype.

May 3, 2013

Audio tools


From the very beginning, Nurbits was conceived as a rhythm puzzle game, and therefore music is a critical part of the game. As we mentioned before, Unity3D, our game engine of choice, has limited support for precise musical timing. As a result, we spent some time evaluating all the possible ways we could implement the music system in Nurbits. Here we'll tell you about some of the options we looked at, along with their advantages and disadvantages. This is going to be relatively technical.

Using prerecorded loops

For a while we debated basing our music system around prerecorded loops of audio. The major advantage of this approach is that the music would sound really good. This is a good approach for most games, where the music is only a background element and loops can be faded in and out as the player navigates the environment or events occur. However, Nurbits is supposed to be a music game, and the major disadvantage is that the user would have very little creative control over the music other than arranging loops. We would be limited in the amount of interactivity we could have between the player and the music: we would only be able to make the loops play, stop, or switch to another loop. We also debated a hybrid approach, where we would have a lower quality dynamic system that eventually switches over to the higher quality loops when the player finishes a puzzle and unlocks a loop. We still really wanted a fully dynamic music system that involved more player creativity, so we continued looking for a better solution...
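To give a sense of how limited that control is, here's roughly what loop switching looks like in Unity. This is a minimal sketch with made-up field names, just to show that play, stop, and crossfade is about all you get.

```csharp
// Minimal sketch of loop control with prerecorded audio: play, stop,
// or crossfade to another loop. Field names are illustrative.
using UnityEngine;

public class LoopSwitcher : MonoBehaviour
{
    public AudioSource current;  // the loop playing now
    public AudioSource next;     // the loop we'll fade into
    public float fadeTime = 1f;

    float t = -1f;               // -1 means no fade in progress

    public void SwitchLoops()
    {
        next.volume = 0f;
        next.Play();
        t = 0f;
    }

    void Update()
    {
        if (t < 0f) return;
        t += Time.deltaTime / fadeTime;
        current.volume = Mathf.Clamp01(1f - t);
        next.volume = Mathf.Clamp01(t);
        if (t >= 1f)
        {
            current.Stop();
            var tmp = current; current = next; next = tmp;
            t = -1f;
        }
    }
}
```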

Mod tracker files

Unity now supports mod tracker files, which are really nice because they can produce high quality audio without taking up much file space. They do so by storing small samples of waveform data and arranging and sequencing those samples to create longer pieces of music. We got particularly excited about XRNS2XMOD, a free utility that converts files from Renoise (modern DAW-style tracker software) to the xm and mod formats that Unity can read. Unfortunately, Unity treats tracker files the same way it treats normal PCM audio files like .wav and .mp3. If Unity exposed programmatic access to mix the individual channels and switch between patterns in a mod file, that would be really powerful. As it stands, you can only play, stop, and loop them. We briefly contemplated trying to write our own mod files in real time, but it turned out to be a bit more trouble than we wanted to get into: mod files are a somewhat complex binary format, and we weren't sure what complications might arise trying to get those dynamically written mod files into Unity's asset pipeline.

Pure Data

Pure Data is a visual programming language that can be used to do some really powerful audio processing, including building soft synths from very basic, low-level modular building blocks. We found a few open source libraries that integrate libpd, an embeddable version of the core PD libraries, with Unity: Kalimba integrates libpd for iOS and Android builds, and libpd4unity integrates it for Windows. There are also a ton of PD patches freely available online that do really cool and powerful things. However, using these open source libraries would limit the platforms we could target unless we did some significant development to make those libraries support other platforms, and there is no way to support web builds. We really want to avoid platform-specific native code plug-in development if possible. Also, none of us has ever used PD before, so there is a learning curve there as well.

Fabric

In Unity 3.5, Unity introduced the MonoBehaviour.OnAudioFilterRead callback function. It essentially lets you read from and write to the audio buffer directly, allowing you to write custom effect filters or create procedural audio. This gives developers the power to do essentially anything with audio, albeit at the very lowest level. Someone had to be writing some powerful tools on top of that at a higher level, right? After digging around we found Fabric. Fabric is a Unity editor extension that adds a very powerful dynamic audio mixing system; we assume it pipes that audio through the OnAudioFilterRead callback. In addition, we got very excited about the Fabric modular synth add-on. The ability to bring all of the power we saw in PD for manipulating real-time synthesis into our game, but do so within Unity, was extremely appealing to us. Unfortunately, when we contacted the developer, the synth extension was still in development. The timing just didn't match up for us to use it on this project.
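For anyone curious what working at that level looks like, here is a minimal example that writes a sine tone straight into the audio buffer via OnAudioFilterRead. It's a toy, not anything from Fabric or our game; attach it next to an AudioSource to hear it.

```csharp
// Toy example of procedural audio through OnAudioFilterRead:
// writes a sine tone directly into the audio buffer.
using UnityEngine;

public class SineGenerator : MonoBehaviour
{
    public float frequency = 440f;  // A4
    public float gain = 0.1f;       // keep it quiet

    float phase;
    float sampleRate;

    void Awake()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    // Called on the audio thread; data is interleaved by channel.
    void OnAudioFilterRead(float[] data, int channels)
    {
        float step = 2f * Mathf.PI * frequency / sampleRate;
        for (int i = 0; i < data.Length; i += channels)
        {
            float sample = gain * Mathf.Sin(phase);
            phase += step;
            if (phase > 2f * Mathf.PI) phase -= 2f * Mathf.PI;
            // Write the same sample to every channel of this frame.
            for (int c = 0; c < channels; c++)
                data[i + c] = sample;
        }
    }
}
```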

UnitySynth

We came across UnitySynth on the Unity forums, a port of the open source C#Synth project to Unity. At first we played around with it and kind of wrote it off, because the main function it seems to fulfill when you first look at it is playing .midi files. Midi files are cool because they are small in file size, but they aren't known for their amazing audio quality. UnitySynth uses sound fonts, which, much like the mod tracker files we looked at before, use small samples to play notes. We found that there are sound fonts available with higher quality samples than the ones included with UnitySynth. After some hacking, we found that we could get UnitySynth's FM synthesis to give us the real-time control of its parameters that we liked when looking at PD and the Fabric modular synth extension. After more hacking, we figured out that we could give UnitySynth essentially fake midi files to play, by creating and arranging our own midi events. We managed to get UnitySynth to play each midi channel on a separate AudioSource, so that we could apply Unity's built-in DSP effects to each one individually. With enough modification, we eventually got UnitySynth to do pretty much everything we wanted.
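Here's a rough sketch of the "fake midi" idea, just to illustrate the shape of it. The type and method names below are approximations we made up for this post; the real UnitySynth/C#Synth API differs in its details.

```csharp
// Sketch of the "fake midi" idea: build note events in code instead of
// loading a .mid file. These names are illustrative, not the actual
// UnitySynth/C#Synth API.
using System.Collections.Generic;

public struct NoteEvent
{
    public int Channel;      // one channel per robot/instrument
    public int Note;         // midi note number, e.g. 60 = middle C
    public int Velocity;
    public double BeatTime;  // when to play, measured in beats
}

public class FakeMidiTrack
{
    readonly List<NoteEvent> events = new List<NoteEvent>();

    public void AddNote(int channel, int note, int velocity, double beat)
    {
        events.Add(new NoteEvent
        {
            Channel = channel,
            Note = note,
            Velocity = velocity,
            BeatTime = beat
        });
    }

    // The sequencer clock walks these events in beat order and calls the
    // synth's note-on/note-off at the right times; each channel is routed
    // to its own AudioSource so Unity's DSP effects can be applied per robot.
    public IEnumerable<NoteEvent> Events { get { return events; } }
}
```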