July 12, 2013

Quick Update

We've let Jony Ive give Nurbits a flat redesign for iOS 7!


Just kidding... those are just some placeholder quads until we get the final models in, but it looks kind of nice, doesn't it?

We've been quiet here on the development blog but things are progressing nicely! We've recently started on the final version of the game. We also spent some time working on a music composition tool, which we hope to tell you about in a future post and possibly even release for people to play around with when we're sure it's finished. On that note (pun?) we are proud to announce that we're partnering with super talented local game music studio Audio Aggregate to do sound design and music composition for Nurbits. They are probably best known for their work on the soundtrack of The Gunstringer, and we are very excited to hear what they come up with for Nurbits!

May 24, 2013

Puttin' the Robot Band Together


So,
A robot band.
What else would an artist want to make, really? I tried to list my top 10 things I want to make as an artist, and robot band was #3 on the list, just below “sexy alien chick” and any kind of Star Trek fan art. So essentially number 1 (since those other two are a given).

The fact is, I might be a bit out of my league here. I do not really think of myself as a character artist, nor have I done any real work for real-time rendering. So why would they want me as the sole artist on this game?
No, really. I'm asking.
Just kidding.
The truth is I have some real dirt on the NIH and IS3D, and blackmail can go a long way. Now, on to some concepts...

Before we get into the robots though, just let me say I have not textured any of these robots yet, so the renders I have are all ambient occlusion renders out of Maya, except for the synth robot, which I slapped a quick texture on so you could see the piano keys (since they are not built into the geometry).

The first robot I made was a ghetto robot. This robot will be the piano player. From a story perspective, he (yeah, the robot is a “he”; does that make me sexist? I don't think so) was the first robot built for the band, because... well... he was the first robot I built for the band. I imagined that when this robot band first got going the budget wasn't there, so I made him out of household items. A lacrosse stick for an arm (well, it's in some households). A file cabinet for a body, and lamps for “eyes”. The feet are a bit less realistic: I made them like the feet on a wind-up toy, mostly because my coworker suggested it and I thought it was hilarious. How's he gonna play piano with one hand? I'm an artist; don't bother me with pesky things like logic and realism.



The 2nd robot I made was awesome. I'm not gonna say anything else about it nor am I going to post any picture of it, because that's all you need to know about it. Trust me. It's awesome.

The 3rd robot was the synth piano player. I made him an actual synth piano. I love the idea of the robot actually playing himself. Talk about made to be a musician! I essentially made a Korg MS-20 and added an arm. I also put a lamp on top for much the same reason as the lamps on the first robot, to simulate a face. For some reason, this is my favorite robot so far.



The 4th robot I call Cyl, because he was made from a cylinder. (Yeah, not very creative, I know.) Cyl is a combination of a few Star Wars characters, the first being R2-D2, perhaps the coolest robot of all time. I also gave him a hat-like head inspired a bit by the hat Bane wears in the Clone Wars series. It is not too much like Bane's hat, but Bane was just somewhat close to the look I wanted for Cyl. Cyl is the bass player and is a bit short: he's got tiny little legs and really long arms, perfect for slappin' out the phat bass licks.



The 5th robot is the guitarist. This robot is an android, more so than the others. As Data from Star Trek TNG will tell you (yeah, I'm geeky enough to use the three-letter acronym; if you don't know what “TNG” stands for, I'm not sure I'm actually allowed to tell you, just know that you'd probably feel out of place at Comic-Con), an android is a robot that looks like a human. While the other robots have anthropomorphized parts, this robot is the most human-like. I did this mostly because there are many guitar poses that necessitate a human form: the sliding on the knees while jamming on the guitar, for example (done very well by Michael J. Fox in “Back to the Future”: “Marty, that was very interesting music”), and pretty much anything done by Angus Young. Plus, seeing as at present there is no plan for implementing vocals, the guitarist will be the “frontman” of the band, and I thought it might be better for animations in general if he looked and moved like a human. And for all those G 'n' R fans out there, I gave him a “bucket head.” Get it?



In the next art post I will discuss how great I am, and other important aspects of the art for Nurbits, like the musical instruments and backgrounds.

May 17, 2013

Prototype 3: Welcome to the Studio


Hello Everyone. Today's post will take you through the different layers of gameplay in our 3rd prototype. This prototype was built as a precursor to starting work on our final product, so some of what you see here should influence how our game looks in the end. This prototype has helped us understand much of the basic functionality that needs to be implemented, and gave us a chance to try different things until we figured out what worked and played best.


This prototype begins to bring in more of the story elements surrounding the puzzles that Stephen showed in our last post. From that last prototype we were able to build up a more complete experience and a system where short musical compositions can be created. The prototype and story begin within a studio environment, with a band of robots. Each robot has its own instrument, and for them to be able to play, the user must venture into their brains to complete puzzles that build up the audio composition. This studio area has little gameplay, other than controlling a few bits of the entire song, but it will bring a lot of personality and customization to our game.

From the studio, the user must enter the brain of a robot. The brain scene is where the bulk of composing the song happens. Here users can sequence together the loops they create, along with connecting effects to any loops they wish. These connections are like hooking the robot's senses to its motor control systems in order to play the notes that have been pieced together.
The brain contains audio nodes and effect nodes that represent different regions of the brain and how they function together. The audio nodes house the puzzle aspect of our game. Completing these puzzles unlocks loops that can then be used in the brains of any robot. We plan to have many pre-made loops/puzzles of varying difficulty, each built around different components of neuroscience. These pre-made loops will guide the user toward creating a good-sounding composition, and eventually toward creating their own loops to be played and unlocked for further creativity in song making.
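
To give a rough picture of how those pieces relate, here's a loose data-model sketch in C#. These class and field names are illustrative, not our actual code:

```csharp
// A loose sketch of the brain scene's contents as described above.
// All names here are illustrative, not from our codebase.
using System.Collections.Generic;

public class EffectNode
{
    public string EffectType;   // e.g. an echo or filter effect
}

public class AudioNode
{
    public string LoopName;
    public bool PuzzleSolved;   // solving the puzzle unlocks this loop
    public List<EffectNode> Effects = new List<EffectNode>();
}

public class RobotBrain
{
    // The order of audio nodes is the sequence of loops the robot plays.
    // Unlocked loops are shared, so any robot's brain can use them.
    public List<AudioNode> Sequence = new List<AudioNode>();
}
```
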
To take you through each of these bits of our game I have created a video walkthrough. 


As always we have a demo here, if you want to have a go at playing a song with your own robot band. They may only look like cubes now, but just wait until the art gets put in during later iterations. So, look forward to that.

May 10, 2013

Prototype 3: Grid puzzle progression


For our third prototype we were trying to think of a better way to integrate music with our puzzle system. We decided to try a grid-based step sequencer, which would work pretty well with our new UnitySynth-based dynamic music system.

Chips could be placed on the grid, and chips of different widths could be used to represent notes of different lengths: eighth notes, quarter notes, half notes and whole notes. Like our previous prototypes, chips would have a threshold to reach before sending their signals on to the next chips in the neural network, and there would be an initial stimulus chip that would start that signal. We also created a relay chip to extend connections between note chips, much like interneurons in the brain. There are also excitatory and inhibitory chips that bring other chips towards or away from threshold.
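
To make the chip mechanics concrete, here's a minimal sketch of the threshold idea in C#. The names and values are hypothetical, and the real chips also handle timing and audio; this just shows signals accumulating toward a threshold, with inhibitory chips passing negative signal:

```csharp
// A minimal sketch (not our actual implementation) of chip thresholds.
using System.Collections.Generic;

public enum ChipKind { Note, Relay, Excitatory, Inhibitory }

public class Chip
{
    public ChipKind Kind;
    public int Threshold = 1;   // signal needed before this chip fires
    public int Potential;       // accumulated incoming signal
    public List<Chip> Outputs = new List<Chip>();

    public void Receive(int amount)
    {
        Potential += amount;    // inhibitory inputs pass a negative amount
        if (Potential >= Threshold)
        {
            Potential = 0;
            Fire();
        }
    }

    void Fire()
    {
        // A note chip would also trigger its audio here.
        int outAmount = Kind == ChipKind.Inhibitory ? -1 : 1;
        foreach (var next in Outputs)
            next.Receive(outAmount);
    }
}
```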




In this prototype we decided chips would connect to each other based on proximity, unlike our previous prototypes, where chips had to be deliberately connected. After implementing these ideas we designed a series of puzzles that attempt to introduce these concepts to the player gradually.
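
Here's a rough sketch, with hypothetical names, of what proximity-based connection can look like. The real prototype works on a grid, but a simple distance check captures the idea:

```csharp
// A sketch of connecting chips by proximity. The radius assumes a grid
// where one cell is roughly 1 world unit; all names are hypothetical.
using System.Collections.Generic;
using UnityEngine;

public class ProximityConnector
{
    public float connectRadius = 1.5f;

    public List<Transform> FindNeighbors(Transform placed, List<Transform> allChips)
    {
        var neighbors = new List<Transform>();
        foreach (var other in allChips)
        {
            if (other == placed) continue;
            if (Vector3.Distance(placed.position, other.position) <= connectRadius)
                neighbors.Add(other);   // close enough: connect automatically
        }
        return neighbors;
    }
}
```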



You can play this prototype in your web browser now!

This prototype formed the foundation of what we are currently planning for the core gameplay mechanics of the final version of Nurbits. It's undergoing some evolutionary improvements as we iterate, but the core idea was present and already felt interesting to us in this prototype.

May 3, 2013

Audio tools


From the very beginning, Nurbits was conceived as a rhythm puzzle game, so music is a critical part of the game. As we mentioned before, Unity3D, our game engine of choice, has limited support for precise musical timing. As a result we spent some time evaluating all the possible ways we could implement the music system in Nurbits. Here we'll tell you about some of the options we looked at and their advantages and disadvantages. This is going to be relatively technical.

Using prerecorded loops

For a while we debated basing our music system around prerecorded loops of audio. The major advantage of this approach is that the music would sound really good. This is a good approach for most games, where the music is only a background element and loops can be faded in and out as the player navigates the environment or events occur. However, Nurbits is supposed to be a music game, and the major disadvantage is that the user would have very little creative control over the music other than arranging loops. We would be limited in the amount of interactivity between the player and the music: we would only be able to make loops play, stop, or switch to another loop. We also debated a hybrid approach where we have a lower quality dynamic system that eventually switches over to the higher quality loops when the player finishes a puzzle and unlocks a loop. We still really wanted a fully dynamic music system that involved more player creativity, so we continued looking for a better solution...
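
For the curious, fading loops in and out in Unity is straightforward, which is part of the appeal of this approach. Here's a minimal sketch using only Unity's standard AudioSource API; the class and field names are our own invention for illustration:

```csharp
// A minimal sketch of crossfading between two prerecorded loops.
using System.Collections;
using UnityEngine;

public class LoopFader : MonoBehaviour
{
    public AudioSource current;   // loop that is playing
    public AudioSource next;      // loop to fade in

    public IEnumerator Crossfade(float seconds)
    {
        next.volume = 0f;
        next.Play();
        for (float t = 0f; t < seconds; t += Time.deltaTime)
        {
            float k = t / seconds;
            current.volume = 1f - k;
            next.volume = k;
            yield return null;    // wait one frame
        }
        current.Stop();
    }
}
```
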

Mod tracker files

Unity now supports mod tracker files, which are really nice because they can produce high quality audio without taking up much file size. They do so by storing small samples of waveform data and arranging and sequencing those samples to create longer pieces of music. We got particularly excited about XRNS2XMOD, a free utility that converts files from Renoise (modern DAW-style tracker software) to the xm and mod formats that Unity can read. Unfortunately, Unity treats tracker files the same way it treats normal PCM audio files like .wav and .mp3. If Unity exposed programmatic access to mix the different channels and switch between patterns in a mod file, that would be really powerful. As it stands, you can only play, stop, and loop them. We briefly contemplated trying to write our own mod files in real time, but it turned out to be more trouble than we wanted to get into: mod files are a somewhat complex binary format, and we weren't sure what complications might arise trying to get dynamically written mod files into Unity's asset pipeline.
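
To illustrate just how limited that is, here's essentially the entire API surface Unity gives you for an imported tracker file. The AudioClip/AudioSource usage is real Unity API; the class name is made up:

```csharp
// Everything you can do with a tracker file in Unity:
// assign the imported clip, then play, stop, or loop it.
using UnityEngine;

public class ModPlayer : MonoBehaviour
{
    public AudioClip modClip;   // an imported .xm or .mod file

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = modClip;
        source.loop = true;     // loop the whole file...
        source.Play();          // ...or Play()/Stop() it; no per-channel control
    }
}
```
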

PureData

Pure Data is a visual programming language that can be used to do some really powerful audio processing, including building soft synths from very basic low-level modular building blocks. We found a few open source libraries that integrate libpd, an embeddable version of the core PD libraries, with Unity. Kalimba integrates libpd for iOS and Android builds, and libpd4unity integrates it for Windows. There are also a ton of PD patches freely available online that do really cool and powerful things. However, using these open source libraries would limit the platforms we could target unless we did some significant development to make those libraries support other platforms, and there is no way to support web builds. We really want to avoid platform-specific native code plug-in development if possible. Also, none of us has ever used PD before, so there would be a learning curve as well.
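
For a sense of why libpd appealed to us anyway, here's a hedged sketch of driving a PD patch from C#. The LibPD calls below are approximations of the libpd C# bindings (exact names vary between Kalimba and libpd4unity), and the receiver names assume a hypothetical patch:

```csharp
// A hedged sketch: send values to named receivers inside a PD patch.
// "pitch", "velocity", and "noteTrigger" are receivers in an imagined patch.
public class PdSynthController
{
    public void PlayNote(int midiNote, float velocity)
    {
        LibPD.SendFloat("pitch", midiNote);      // set oscillator pitch
        LibPD.SendFloat("velocity", velocity);   // set amplitude envelope
        LibPD.SendBang("noteTrigger");           // fire the note
    }
}
```
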

Fabric

In Unity 3.5, Unity introduced the MonoBehaviour.OnAudioFilterRead callback function. It essentially lets you read and write directly to the audio buffer, allowing you to write custom effect filters or create procedural audio. This gives developers the power to do essentially anything with audio, albeit at the very lowest level. Someone had to be writing some powerful tools on top of that at a higher level, right? After digging around we found Fabric. Fabric itself is a Unity editor extension that adds a very powerful dynamic audio mixing system, which we assume pipes that audio through the OnAudioFilterRead callback. In addition, we got very excited about the Fabric modular synth add-on. The ability to add all of the power we saw in PD for manipulating real-time synthesis in our game, but within Unity, was extremely appealing to us. Unfortunately, when we contacted the developer, the synth extension was still in development. The timing just didn't match up for us to use it on this project.
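
To show what OnAudioFilterRead looks like in practice, here's a minimal procedural audio sketch: a sine wave written directly into the buffer. The callback is real Unity API; everything else is our own illustrative code. Attach it to a GameObject with an AudioSource:

```csharp
// A 440 Hz sine wave generated straight into the audio buffer.
using UnityEngine;

public class SineFilter : MonoBehaviour
{
    public float frequency = 440f;
    float phase;
    float sampleRate;

    void Start()
    {
        sampleRate = AudioSettings.outputSampleRate;
    }

    // Runs on the audio thread; data is interleaved by channel.
    void OnAudioFilterRead(float[] data, int channels)
    {
        float step = 2f * Mathf.PI * frequency / sampleRate;
        for (int i = 0; i < data.Length; i += channels)
        {
            float sample = Mathf.Sin(phase) * 0.25f;  // keep the level modest
            phase = (phase + step) % (2f * Mathf.PI);
            for (int c = 0; c < channels; c++)
                data[i + c] = sample;
        }
    }
}
```
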

UnitySynth

We came across UnitySynth on the Unity forums, a port of the open source C#Synth project to Unity. At first we played around with it and kind of wrote it off, because the main function it seems to fulfill at first glance is playing .midi files. MIDI files are cool because they are small in file size, but they aren't known for their amazing audio quality. UnitySynth uses sound fonts, which, much like the mod tracker files we looked at before, use small samples to play notes. We found that there are sound fonts available with higher quality samples than the ones included with UnitySynth. After some hacking we found that we could get UnitySynth's FM synthesis to give us the real-time control of its parameters that we liked when looking at PD and the Fabric modular synth extension. After more hacking we figured out that we could give UnitySynth essentially fake MIDI files to play, by creating and arranging our own MIDI events. We managed to get UnitySynth to play each MIDI channel on a separate AudioSource so that we could apply Unity's built-in DSP effects to each one individually. With enough modification we eventually got UnitySynth to do pretty much everything we wanted.
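
Here's a hedged sketch of the "fake MIDI file" idea. UnitySynth's actual types and method names differ; NoteEvent and SequenceBuilder below are hypothetical stand-ins for the MIDI event structures we build up and hand to the synth instead of reading a .mid from disk:

```csharp
// Building MIDI events in code rather than loading a .midi file.
public struct NoteEvent
{
    public int Channel;      // one AudioSource per channel in our setup
    public int Note;         // MIDI note number, e.g. 60 = middle C
    public int Velocity;
    public double BeatTime;  // when the note starts, in beats
}

public class SequenceBuilder
{
    // One bar of quarter notes, assembled entirely in code.
    public NoteEvent[] QuarterNotes(int channel, int note)
    {
        var events = new NoteEvent[4];
        for (int beat = 0; beat < 4; beat++)
            events[beat] = new NoteEvent {
                Channel = channel, Note = note,
                Velocity = 100, BeatTime = beat
            };
        return events;
    }
}
```
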


April 26, 2013

Prototype 2: Summation

For our second Nurbits prototype I was attempting to explore a mechanic based on the neuroscience concept of summation. Summation is the way signals from one or more incoming connections combine to bring a neuron's membrane potential to threshold, making it fire an action potential. I was also trying to see what it would be like to give the player more direct control over when loops get played, rather than playing them on a timer as in Brian's previous prototype.

design sketch

As you can see this prototype continues with our concept of pie slices representing neurotransmitters, and has a similar connection mechanic to the previous prototype. I wanted the player to be able to use both types of summation: temporal and spatial, which I feel this prototype achieved.


prototype screenshot

Temporal summation is when a neuron receives multiple signals in quick succession that together bring its membrane potential to threshold. The player can induce temporal summation in this prototype by connecting up a single input node and hitting the red button quickly to bring the output node to threshold and play a loop. Spatial summation is when a neuron receives signals from multiple other neurons that combine to bring the membrane potential to threshold. In this prototype that can be done by connecting up and hitting multiple buttons.
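
A minimal sketch of temporal summation as a mechanic, with hypothetical names: each signal bumps a potential that decays over time, so only signals arriving in quick succession reach threshold:

```csharp
// Incoming signals raise the potential; decay makes timing matter.
using UnityEngine;

public class SummingNeuron : MonoBehaviour
{
    public float threshold = 1f;
    public float decayPerSecond = 0.8f;
    float potential;

    public void ReceiveSignal(float strength)   // + excitatory, - inhibitory
    {
        potential += strength;
        if (potential >= threshold)
        {
            potential = 0f;
            Fire();
        }
    }

    void Update()
    {
        // Decay toward resting potential between signals.
        potential = Mathf.MoveTowards(potential, 0f, decayPerSecond * Time.deltaTime);
    }

    void Fire()
    {
        // Here we would trigger the loop assigned to this output node.
    }
}
```
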

Here's a video showing this prototype in action:


If you'd like to play this prototype yourself you can do so over here.

I briefly mention at the end of the video that this prototype uses mod tracker files for its music. We'll have more information about that, and the other options we looked at for our dynamic music system, in our upcoming audio tools post.

April 19, 2013

Prototype 1: Chips and Wires


Hello Everyone. My name is Brian and I am the other half of the programming team on Nurbits. Today I will take you through the design process of the first prototype for our game.

Once we settled on the pie slice game mechanic, I began prototyping what I thought it might look like as a basic step-sequencer-style puzzle game that relied heavily on the chip-as-neuron association outlined in our grant proposal. I began work with a chip model that carries one of our 3 patterns of differing time scales. The 3 pattern representations are shown below; in each, the sections in yellow must be hit for the chip to reach its threshold and fire. The patterns not only represent the threshold that must be reached, but also the pattern of signals that the chip fires on to whichever chip it is connected to.



The patterns on the chips represent beats in a measure of a loop of a song, with each “slice” on the pattern representing either a quarter, eighth, or sixteenth note. Having these different note lengths/pattern sizes not only helps increase the difficulty throughout play, but also allows for more varied and unique musical composition.

Each chip also has a number of inputs on the left and an output “wire” on the right that can be hooked to another chip's inputs. The example below shows a 16-beat pattern with two inputs.


The puzzle I came up with for this prototype involves connecting a string of chips together from the starting “stimulus” chip to a goal chip at the other end. The stimulus chip is there to give one or many starting signals to the puzzle, each with its own pattern corresponding to the pattern of a chip. The player must recognize and match these patterns with the correct chips, and once a chip is successfully connected, the chip's audio will play in time with its pattern.
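
Here's a rough sketch, with made-up names, of the pattern matching at the heart of this puzzle: a connection only counts if the stimulus pattern matches the chip's own pattern, and a matched chip then plays its audio on that pattern:

```csharp
// Pattern matching and step playback, sketched out.
public class PatternChip
{
    public bool[] Pattern;   // e.g. 16 slots, true = beat is active

    public bool Matches(bool[] stimulus)
    {
        if (stimulus.Length != Pattern.Length) return false;
        for (int i = 0; i < Pattern.Length; i++)
            if (stimulus[i] != Pattern[i]) return false;
        return true;
    }

    // Called once per step by the sequencer clock.
    public void OnStep(int step)
    {
        if (Pattern[step % Pattern.Length])
        {
            // play this chip's sound, and forward the signal downstream
        }
    }
}
```
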

To better show this, I have made the video below to take you through a level of this prototype.
Once you've watched the video and understand the basics of our gameplay, feel free to try the demo yourself, which we have provided here (Unity Web Player required to play).

Creating this first prototype took less than a week, which may be clear if you made something unexpected happen in the demo, but it was very helpful not only for me to get my ideas out, but also for the team to have a starting point to critique and iterate on. This prototype was used as the jumping-off point for some different gameplay ideas and some more prototypes to come. I hope you enjoyed this look into our development process, and remember to keep a lookout for our next post and playable prototype.


March 21, 2013

Pie slices


In the last blog post I mentioned that we were struggling with how to represent our core action potential mechanic. We needed some way to represent how the player uses neurotransmitters to get the chips/neurons in the game to their threshold so that they fire.

After much brainstorming we came up with an idea. What if instead of combining colors and shapes, we represented the neurotransmitters spatially as the sections of a divided circle, essentially pie slices?

An early design sketch of our pie slice idea.

The pie slices on the circle would indicate which neurotransmitters are excitatory and take the chip/neuron's membrane potential towards threshold, and also which are inhibitory and count against reaching threshold. To get the right chip/neuron to fire you would have to carefully connect up your neural network to combine the right neurotransmitters, making strategic placement essential.
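
A small sketch of the pie-slice logic as we think of it, with illustrative names: excitatory slices push the membrane potential up, inhibitory slices push it down, and the neuron fires at threshold:

```csharp
// Each slice is a neurotransmitter input; sign determines its effect.
public class PieSliceNeuron
{
    public float Threshold = 3f;

    public bool WouldFire(float[] excitatorySlices, float[] inhibitorySlices)
    {
        float potential = 0f;
        foreach (float s in excitatorySlices) potential += s;
        foreach (float s in inhibitorySlices) potential -= s;
        return potential >= Threshold;
    }
}
```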

Another sketch, brainstorming temporal and spatial summation.

With this idea in mind we started working on a series of prototypes, which we plan on showing you in our next few blog posts.

March 15, 2013

Design concepts outlined in the grant

We are a small team, two programmers (myself included) and an artist, and we all work collaboratively on the game design. Our biggest challenge is making a game that is fun and interesting while staying true to the goals and intent of the grant. The grant gives us a solid foundation for our game design, but there is still much to be done. So many of the details that contribute to a game's design only come into existence when you start actually developing the game. I personally like this about game development, because it allows a lot of room for creativity for everyone involved.

Logo concept art from the grant.

I managed to whittle the core design ideas outlined in the grant down to a bulleted list:

  • Nurbits should be a casual game to appeal to a wide audience -- the grant references the PopCap hits Bejeweled, Peggle, and Plants vs Zombies.
  • As the player solves puzzles there is an aesthetic change:
    • Visually the puzzle pieces go from looking like chips on a circuit board to looking more like neurons.
    • The music in the game transitions from a chiptune-like style to a fuller, more natural sound.

Concept art of chip to neuron transition from the grant.

  • The core gameplay involves the player placing the appropriate neurons to connect an existing structure.
  • Core gameplay will revolve around identifying the logic used by the neuron’s cell body to initiate action potentials that meet the requirements of a downstream "goal" neuron.
  • The grant mentions the player changing the color to represent the neurotransmitters used by the neurons, and the brightness of the color representing the threshold and rate of fire.
  • Higher level play requires the use of inhibitory interneurons.
  • Nurbits will also have a "sandbox" that can be unlocked if the player progresses to a certain point, which will allow students to create their own neural systems and audio tracks.

This gives us a lot to work with, but it also leaves us with some problems to solve. The first one is a technical problem that will deeply affect our design. How will we implement our music system? Based on previous experience we've found that Unity, our game engine of choice, has limited capabilities for precise low-level audio timing. Using Unity's built-in audio system we could use prerecorded loops, which would sound really good but limit the player's creative possibilities. There are also a number of more powerful audio tools we looked at, but they each involved significant technical challenges, and some potentially limited the platforms we could target.
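
As one concrete example of the timing problem: assuming a Unity version with AudioSource.PlayScheduled (added in Unity 4), you can start a clip sample-accurately on the audio clock, but that's about as fine-grained as the built-in system gets; you schedule whole clips, not individual notes. A sketch:

```csharp
// Schedule a loop to start exactly on the next bar, using the audio clock.
using UnityEngine;

public class BarScheduler : MonoBehaviour
{
    public AudioSource source;   // assumed to have its clip assigned
    public double bpm = 120.0;

    public void PlayOnNextBar()
    {
        double barLength = 4 * 60.0 / bpm;   // 4 beats per bar
        double now = AudioSettings.dspTime;
        double nextBar = System.Math.Ceiling(now / barLength) * barLength;
        source.PlayScheduled(nextBar);       // sample-accurate start
    }
}
```
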

Concept art of chip with color and shape based logic from the grant.

The second problem is a design issue at the very core of what the gameplay will be like in Nurbits. How will the action potential mechanic work? The grant mentions changing colors to represent neurotransmitters that would bring the neuron to threshold and initiate an action potential. However, there is a major problem with this idea: it is not accessible to colorblind players. The original Nurbits design doc suggested combining shapes as a solution, but we felt this might be difficult for the player to understand at a quick glance.

How did we resolve these problems? Stay tuned for our next blog post!

February 28, 2013

What is Nurbits?

Nurbits is the latest project currently in development by IS3D (Interactive Science in 3D), funded by an NIH SBIR grant. The goal of the project is to create a music puzzle videogame that teaches the principles of neuroscience through play. By building neural networks to solve puzzles, players will experience the way neurons reach threshold and fire action potentials to transmit signals through our brains, while at the same time creating music.

We plan to use this blog to document our process of creating Nurbits. Stay tuned for upcoming posts on our design and prototyping process, as well as video and playable versions of some of our current prototypes.