As my notes were scattered across a notepad and my laptop, I decided to do a weekly write-up of my research and ideas. I documented it on this blog so I could keep clear track of my progress.
WEEK 1 | MODULE INTRO
My initial ideas after our first seminar were instruments in the web browser. I liked the idea of making a sequencer in the web browser, so I decided to research what was already out there. As I'm not a Music Tech student, I wanted to look at sequencers on the more simplistic side, so I could get an idea of what was achievable. I came across the ToneMatrix, which is based on a simple 16x16 grid and lets you create melodies by selecting and deselecting squares. It was really easy to use, and after playing around with different sequences I began to think about how I might develop a web instrument like this. If I created something similar, I could potentially add some kind of sliders that affect the sound. I also thought about ways to make it more interesting visually, and liked the idea of using subtle colour changes to represent different octaves.
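To get my head around how a grid sequencer like this works underneath, I sketched its core as a small bit of JavaScript: a 2D array of on/off cells, where each column is one step of the loop. This is just my own model of the idea, not the ToneMatrix's actual code, and all the names are mine.

```javascript
// Minimal model of a ToneMatrix-style step sequencer grid.
// Each row represents a pitch, each column a step in the loop.
const SIZE = 16;

// 16x16 grid of booleans, all cells off to start.
const grid = Array.from({ length: SIZE }, () => Array(SIZE).fill(false));

// Clicking a square toggles it on or off.
function toggleCell(row, col) {
  grid[row][col] = !grid[row][col];
}

// On each tick, read one column and return the rows (pitches) to play.
function activeRows(col) {
  const rows = [];
  for (let row = 0; row < SIZE; row++) {
    if (grid[row][col]) rows.push(row);
  }
  return rows;
}

toggleCell(0, 4);
toggleCell(7, 4);
console.log(activeRows(4)); // [0, 7]
```

A real version would then map each returned row to a note and play it with the Web Audio API, which is roughly where my slider and colour-change ideas would slot in.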
After beginning my research, it was clear that I needed to brush up on my understanding of basic waveforms, synthesizers and filters. Having not done anything music-technology based for a few years, I felt I wouldn't be able to achieve anything in a web browser without revisiting the basics. I've scanned in some of the notes I made from Beau Sievers' 'A Young Person's Guide to the Principles of Music Synthesis' to make clear what I was trying to understand this week.
WEEK 2 | WEB AUDIO
Practising with Code Circle this week gave me a better insight into what might be possible when coding in the web browser. I decided to try to code a simple sequence using different sound waves to reinforce my understanding of how simple code works. The adjacent video is a short clip of my code in Code Circle. However, incorporating music into my coding straight away led me to falsely believe that I understood how things worked; when it came to making more complex things work on my own, I had no idea! After class, I decided that I needed to get my head around how code works before sound is involved. Luckily, through some research, I came across a website called Code Academy, which helps you learn Javascript with text alone.
Code Academy was an incredibly helpful website to find so early on, as it essentially re-explains all of the things we'd learnt in class and makes you practise everything before you can move on. Practising my coding with just text really helped cement my understanding of the basic functions and control flow of Javascript. There is a section called 'Learn' where you read the explanation of the lesson you are on. You are then given a series of tasks to code: you type your code into the adjacent window and hit run. If your code is correct, it appears in the console and your task box is ticked and turns green. You then unlock the next task, and so on until the lesson is complete. I've attached some scanned pages of my notebook below that demonstrate what I was trying to understand in Javascript this week.
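To give a flavour of what I mean by basic functions and control flow, here's my own made-up example of the kind of exercise I was working through at this stage (not an actual Code Academy task) — a function, an if/else, and a loop:

```javascript
// A typical beginner exercise: write a function with a condition,
// then call it from a loop and collect the results.
function describeNumber(n) {
  if (n % 2 === 0) {
    return n + " is even";
  } else {
    return n + " is odd";
  }
}

// Loop over a few numbers and collect the descriptions.
const results = [];
for (let i = 1; i <= 3; i++) {
  results.push(describeNumber(i));
}
console.log(results); // ["1 is odd", "2 is even", "3 is odd"]
```

Small text-only exercises like this were much easier to reason about than code that also had to make sound.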
WEEK 3 | SOUND AND DATA
After our seminar this week on data, I knew this wasn't where my interests lay, so I continued my research into instruments in the web browser, as this was something I would potentially like to do. Knowing that week 4's seminar was going to be on mobile devices, I decided to look at instruments that were accessible via a phone's web browser. I came across a talk by Norbert Schnell at Loop, which is hosted by Ableton. He discusses interactive music-making on the mobile web and lets audience members join in via their mobile devices. I found all of the examples really interesting, and I have included one in the adjacent video, where he demonstrates an application that plays around with Queen's 'We Will Rock You': different movements are allocated different sections of the song, e.g. the vocal and guitar parts.
Prior to starting this module, I was aware of 'teamLab', who put on a series of interactive art exhibitions. They believe that "the digital domain can expand the capacities of art, and that digital art can create new relationships between people." A lot of their work centres around nature, and by this point in the module I felt I had settled on the idea of creating something using natural sounds for my project. I love their art installations, and they recently brought 'Transcending Boundaries' to London, shown in the adjacent video, which I unfortunately missed out on tickets for. I have included a link to their website below the video so you can see some of their other works.
I started to think about how I could create something interactive that used samples to build up an audio environment that wasn't based on melodies or building a song. I was interested in the example given to us of CoSiMa's 'ProXoMix Prototype Application'. I liked the idea of it interacting with space, and how each person affected another person's sound experience. So with this application as a starting point, I tried to develop an idea that could incorporate sounds other than music. I also wanted to explore how to make something like this more visually engaging, so my instant reaction was to research interactive art installations.
I thought about the concept of a room with some basic projections on the walls of an outdoor environment, such as a woodland; real-life plants could potentially be placed in the room as well. I then had the idea of creating an app similar to ProXoMix that would allocate different nature sounds, e.g. birds or wind, to different movements of the phone. I really liked the feature of everyone interacting with each other, so I thought it would be nice to have that work within the app too, so that the environmental sounds changed depending on where you were in the room.
WEEK 4 | MOBILE AUDIO
Unfortunately I was absent for week 4's lesson on mobile sound due to being unwell. This threw me off some of my previous ideas, as we seemed to be covering one area per week that we could potentially produce our project in, and I no longer felt I wanted to pursue anything mobile based. Knowing our following week's seminar was going to be based on games, I started thinking about games I had previously played and enjoyed that were more puzzle based, as creating a simple puzzle game seemed like a more realistic goal. The first few that came to mind were Portal 2 and Little Big Planet, so I got these games out and played them, paying attention to the different ways in which sound was used.
I started to think about simplistic ideas for games that could be made in Unity for my project. My initial idea was some kind of ball-rolling game that moves around and jumps across moving planes. I thought about how to incorporate basic sounds, e.g. a certain sound effect for jumping. I also considered adding some kind of point-scoring system to make more room for sound effects, so I thought about the sounds used in games such as Sonic the Hedgehog, where you collect rings, or the sounds in Little Big Planet when you walk over the bubbles.
WEEK 5 | GAME AUDIO
I decided to start with the simple ball-rolling game idea I had last week. Wanting to develop it further, I looked at what other people had made using Unity. I watched various YouTube videos similar to my idea for inspiration, and have included a few below that range from basic to much more complex.
WEEK 6 | VIRTUAL REALITY
Working on VR in our seminar this week confirmed for me that I wanted to work in Unity on a 3D game, as I felt VR was a bit out of my depth. I decided to research a different type of game idea, as I didn't want to settle on the first thing that came to mind and felt it was important to still be experimenting at this stage. I watched some gameplay from the newest Zelda game, released earlier this year: The Legend of Zelda: Breath of the Wild. This re-inspired my earlier thoughts centred around using nature sounds. I now started thinking about the possibility of creating different audio zones, and how different areas would require different audio environments, e.g. in a cave you would need reverb.
After researching further, I decided to start a new project in Unity. I found a tutorial series online for an RPG-style game, so I followed it to get an environment set up, as it used a free package from the asset store for all of the objects. It then talked you through typing the scripts, which I found useful for familiarising myself with terms commonly used in game scripts. I did encounter a few errors: each Unity update can change how certain parts of the code are written, so this took longer to set up than I anticipated, as I didn't know enough to figure out the new scripting on my own. I managed to create a basic environment modelled on this tutorial and began to implement simple things we'd learnt in class, such as triggering audio clips using colliders. Once it was set up with a working player, I added a soundtrack to the plane I created for the ground object. I then placed a box collider over a crystal object that triggered a twinkly sound clip when the player entered it. I'm considering changing my player view from first person to third person, as I don't like the way the mouse moves the camera around and think it will look better in third person.
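In outline, the crystal trigger works like this — a simplified sketch of my approach rather than my exact script, with names of my own choosing:

```csharp
using UnityEngine;

// Attach to the crystal, which has a Box Collider marked "Is Trigger"
// and an AudioSource holding the twinkly clip (Play On Awake off).
public class CrystalTrigger : MonoBehaviour
{
    private AudioSource twinkle;

    void Start()
    {
        twinkle = GetComponent<AudioSource>();
    }

    // Unity calls this when another collider enters the trigger volume.
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            twinkle.Play();
        }
    }
}
```

The key part is that the collider is a trigger rather than a solid object, so the player passes through it and `OnTriggerEnter` fires instead of a physical collision.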
WEEK 7 | UNITY CONTINUED
With our presentations a week away, I decided that my concept and research needed to be very clear and concise, so I didn't develop my game much further this week. I did further research into how audio proximity works and how to create different audio zones, as I knew this was something I wanted to implement. I took a look at Unity's audio mixer, but didn't begin to learn how to use it, as a lot of the tutorials online weren't based on much scripting, and scripting was the main focus of our module. The video to the right shows where I've got to with my Unity project so far, and this is what I will present in our presentations next week. I decided to change my controller to a third-person view, as I did not like the way the first-person camera moved around my map.
WEEK 8 | PRESENTATIONS
After seeing everyone else's presentations, I felt I had a better idea of the level of work I was aiming for and what was possible. I received good constructive feedback on my project so far and felt happy continuing with this idea, so I carried on developing my map in Unity. I figured that the quicker my map was built, the quicker I could start implementing my scripts.
WEEK 9 | LEARNING UNITY | GAME DEVELOPMENT
A lot of the tutorials I'd watched prior to our presentations were heavily based around elements that weren't very relevant to my project, and delved further into game mechanics than the general scripting that would help me. I decided to go back to tutorials, as I was still slightly confused by even the basics after overloading myself with complex code, and was finding it hard to take scripts I sourced from tutorials and online forums and adapt them to do what I wanted. I started by focusing on triggers and booleans, and have included screen recordings of two exercises I did from a tutorial, along with the scripts. The first video disables the cubes one at a time, and the second disables them together using if statements. As I was focusing on my other module this week, I wasn't able to develop my game environment much further.
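The second exercise — disabling the cubes together with booleans and an if statement — can be sketched like this. This is a reconstruction of the idea from memory, not the tutorial's exact script:

```csharp
using UnityEngine;

// Each cube's trigger reports in via MarkTouched; once both booleans
// are set, an if statement disables the two cubes together.
public class CubeDisabler : MonoBehaviour
{
    public GameObject cubeA;   // assigned in the inspector
    public GameObject cubeB;

    private bool touchedA = false;
    private bool touchedB = false;

    // Called by a small trigger script on each cube.
    public void MarkTouched(GameObject cube)
    {
        if (cube == cubeA) touchedA = true;
        if (cube == cubeB) touchedB = true;
    }

    void Update()
    {
        // Only disable once both booleans have been set.
        if (touchedA && touchedB)
        {
            cubeA.SetActive(false);
            cubeB.SetActive(false);
        }
    }
}
```

Doing the one-at-a-time version first made it much clearer why the boolean flags are needed here: the if statement has to wait for both conditions rather than reacting to each trigger immediately.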
I then moved on to create a script that would allow my player to pick up objects as I was hoping to create some kind of system that would allow me to trigger different audio clips and slowly build a soundtrack. I wanted to build something based around triggers that would allow me to pick up an object, and then put it down it in a certain place set up as a trigger similar to the example I showed earlier in this blog from Portal 2. I wasn’t yet sure how to do this, so again had to watch a lot more tutorials.
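The common pattern I kept seeing in tutorials for this is to parent the object to the player while it's carried. A sketch of that idea, with hypothetical names of my own (not a finished script from my project):

```csharp
using UnityEngine;

// Attached to the player: press E near an object tagged "Pickup"
// to carry it, and E again to drop it where you stand.
public class PickUp : MonoBehaviour
{
    public Transform holdPoint;   // empty child placed in front of the player
    private GameObject nearby;    // pickup object we are currently touching
    private GameObject held;      // object we are carrying, if any

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Pickup")) nearby = other.gameObject;
    }

    void OnTriggerExit(Collider other)
    {
        if (other.gameObject == nearby) nearby = null;
    }

    void Update()
    {
        if (!Input.GetKeyDown(KeyCode.E)) return;

        if (held == null && nearby != null)
        {
            held = nearby;
            held.transform.SetParent(holdPoint);   // carry the object
            held.transform.localPosition = Vector3.zero;
        }
        else if (held != null)
        {
            held.transform.SetParent(null);        // drop it in place
            held = null;
        }
    }
}
```

The drop zone would then just be another trigger that checks the tag of whatever lands inside it, Portal 2 style, and starts the relevant audio clip.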
WEEK 10 | GAME DEVELOPMENT | LEARNING UNITY
Unfortunately I misplaced the USB with my adventure game project on it, so had to start over. I immediately went back to my roll-a-ball idea, as I'd already watched tutorials on it and done quite a lot of research, so I felt I could still achieve something of a good level with only two weeks until the deadline. As much as I could have downloaded a pre-created game world, I felt I would find it much easier to create my scripts, and have a much better understanding of how to make them work together, if I built the world myself. Things as simple as knowing what objects are in the game world and naming/grouping them myself would make all the difference, minimising potential errors from my scripts clashing with something already there that I didn't know about.
I used the same low-poly asset store package that I had used for my adventure game to create my world. I was already familiar with what was in the package, and felt I could create something aesthetically pleasing without it being too complicated, e.g. not having to mess around much with textures and materials to get everything to match. I wanted to create a few short levels in order to avoid getting wrapped up in building a complex environment and wasting crucial time. I thought it was best for each level to have a different theme, so I could implement different effects via code, e.g. reverb and filters. I laid out a basic map using bridges and platforms so I had an environment to work with. I imported a gemstone asset pack that I found online, laid the gemstones out as collectable objects and added an audio trigger script to them. I then created a text UI so I could add a counter for the collectable gemstones, to give them purpose within the game. I changed the counter in the player script so that the 'win' text appears when you collect 10 or more gemstones.
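The counter logic follows the standard roll-a-ball tutorial pattern. This is a simplified sketch of how mine works rather than the script itself, using Unity's legacy UI Text and names I've chosen for illustration:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Attached to the player. Counts collected gemstones, updates the
// text UI, and shows the win text once 10 or more are picked up.
// Each gemstone has a trigger collider and the tag "Gem".
public class GemCounter : MonoBehaviour
{
    public Text countText;   // on-screen counter
    public Text winText;     // hidden until the player wins
    private int count = 0;

    void Start()
    {
        winText.gameObject.SetActive(false);
        UpdateDisplay();
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Gem"))
        {
            other.gameObject.SetActive(false); // "collect" the gemstone
            count++;
            UpdateDisplay();
        }
    }

    void UpdateDisplay()
    {
        countText.text = "Gems: " + count;
        if (count >= 10)
        {
            winText.gameObject.SetActive(true);
        }
    }
}
```

Deactivating the gemstone rather than destroying it keeps things simple, and the same `OnTriggerEnter` is where the audio trigger for each gemstone fires.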
I wanted to create a moving platform to make my game a bit more interactive, so I looked up tutorials online for how to make game objects move in Unity. I found one that I used to get my platform moving from A to B. The platform moved, but once my player jumped onto it, the player didn't move with it. The solution took a while to figure out because, even though I was following a tutorial, other elements in my game were affecting it. Through lots of trial and error, I finally realised that the scale of my player had to be 1 x 1 x 1, otherwise it would deform while held on the moving platform by my 'Hold Player' script. I've attached examples to the right, and my scripts are attached below.
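The two pieces involved can be sketched roughly as follows — a simplified reconstruction of the approach, not my attached scripts verbatim (in practice these are two separate script files):

```csharp
using UnityEngine;

// Script 1: moves the platform back and forth between two empty
// game objects placed at the ends of its path.
public class MovingPlatform : MonoBehaviour
{
    public Transform pointA;
    public Transform pointB;
    public float speed = 0.5f;

    void Update()
    {
        // PingPong bounces a value between 0 and 1 over time.
        float t = Mathf.PingPong(Time.time * speed, 1f);
        transform.position = Vector3.Lerp(pointA.position, pointB.position, t);
    }
}

// Script 2 ("Hold Player"): parents the player to the platform while
// they stand on it, so they ride along instead of sliding off.
// Parenting is why the player's scale has to stay at 1 x 1 x 1 —
// otherwise the transform gets distorted while held.
public class HoldPlayer : MonoBehaviour
{
    void OnCollisionEnter(Collision other)
    {
        if (other.gameObject.CompareTag("Player"))
            other.transform.SetParent(transform);
    }

    void OnCollisionExit(Collision other)
    {
        if (other.gameObject.CompareTag("Player"))
            other.transform.SetParent(null);
    }
}
```

Parenting means the platform's movement is applied to the player automatically, which is exactly the behaviour that was missing when the player just stood on top of it.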
WEEK 11 | GAME DEVELOPMENT | LEARNING UNITY
ROTATION
I decided my first level would have an island theme, so I changed the skybox to a warmer preset that I downloaded. I wanted to make the gemstones rotate, so I created a really simple script that rotates them on the Y axis, which I've attached to the right.
BUTTERFLIES
I then started setting up an area for my proximity audio. I still wanted to keep my level relatively simple, so I created an area with a few trees to represent a forest. I then added butterflies to represent birds, as I couldn't find any birds in the asset store. I wanted the 'birds' to fly around my forest area, so I used the script I made for the moving platform as a starting point. I created more empty game objects set as triggers so I could create more positions for my butterflies to move between. The way I had previously coded the moving platform left too much room for spelling errors with this many positions to move between, so I had to find another way to script the movement. I came across enumerations, and found that this was a much better way to script movement between 8 positions. I set it up using public transforms, which meant I could assign all of the positions in the inspector. I have attached copies of my script, as well as what my inspector looks like, to the right.
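To show the two ideas side by side, here's a simplified sketch of both scripts — the one-line rotation and the enumeration-based butterfly movement. Names and exact values are my own for illustration:

```csharp
using UnityEngine;

// Script 1: rotates a gemstone around its Y axis.
public class Rotator : MonoBehaviour
{
    public float speed = 60f; // degrees per second

    void Update()
    {
        transform.Rotate(0f, speed * Time.deltaTime, 0f);
    }
}

// Script 2: moves a butterfly through eight positions assigned in
// the inspector. An enum tracks which leg of the journey it is on,
// instead of comparing hand-typed names (which invited spelling errors).
public class ButterflyMover : MonoBehaviour
{
    private enum Target { P1, P2, P3, P4, P5, P6, P7, P8 }

    public Transform[] points = new Transform[8]; // set in the inspector
    public float speed = 3f;

    private Target current = Target.P1;

    void Update()
    {
        Transform goal = points[(int)current];
        transform.position = Vector3.MoveTowards(
            transform.position, goal.position, speed * Time.deltaTime);

        // On arrival, advance to the next position, wrapping at the end.
        if (Vector3.Distance(transform.position, goal.position) < 0.01f)
        {
            current = (Target)(((int)current + 1) % 8);
        }
    }
}
```

Because the enum values map straight onto the array of public transforms, adding or reordering positions only means editing the inspector, not retyping names through the script.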
PROXIMITY AUDIO
I found a tutorial online that showed how to script proximity audio on a small scale, so I followed it to make sure it worked before scaling it up in my project. It lets me add audio to a certain part of my map, triggered by the distance between the player and the object the script is attached to. I went on to adapt the script so I could add an audio source in the inspector, along with a second range.
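The adapted version works along these lines — a sketch of the idea with my own names, assuming the audio source loops continuously and only its volume changes with distance:

```csharp
using UnityEngine;

// Fades an audio source in as the player approaches the object this
// script is attached to: full volume inside minRange, silent beyond
// maxRange, and scaled linearly in between.
public class ProximityAudio : MonoBehaviour
{
    public AudioSource source;   // assigned in the inspector
    public Transform player;
    public float minRange = 3f;  // full volume inside this distance
    public float maxRange = 15f; // silent beyond this distance

    void Update()
    {
        float distance = Vector3.Distance(player.position, transform.position);

        // InverseLerp maps the distance onto 0..1 between the two ranges.
        float volume = 1f - Mathf.InverseLerp(minRange, maxRange, distance);
        source.volume = Mathf.Clamp01(volume);
    }
}
```

Having both ranges public is what the "second range" gives me: the near range controls where the sound sits at full volume, and the far range controls how gradually it fades in as you approach the forest.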
WEEK 12 | GAME DEVELOPMENT
I created my second level with a spooky theme, and managed to develop it quite quickly, as I used the same asset store pack for the objects in my map and based the scripts on those from my first level, making a few adjustments, mainly to the clips used. I wrote a few new scripts for this level: one added to the rotating coffin, and a second inside the cave. The rotating coffin got a random audio clip script, as I thought it would make a surprise trigger (it isn't obvious that any sound will be triggered at that point); I added three 'spooky' clips to it. Inside the cave I created a reverb script with random elements. This script triggers and then the floor falls through, and I thought these worked nicely together as a sound/game interaction. I have included the scripts below.
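The random clip idea is very compact in Unity. A simplified sketch of how the coffin trigger works (my included scripts below are the real versions; names here are illustrative):

```csharp
using UnityEngine;

// Plays one of several clips at random when the player enters the
// trigger volume around the rotating coffin.
public class RandomClipTrigger : MonoBehaviour
{
    public AudioClip[] clips;    // the three 'spooky' clips, set in the inspector
    public AudioSource source;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player") && clips.Length > 0)
        {
            // Random.Range with ints excludes the upper bound,
            // so this picks a valid index into the array.
            source.clip = clips[Random.Range(0, clips.Length)];
            source.Play();
        }
    }
}
```

Because the clip is chosen fresh each time the trigger fires, walking past the coffin twice can produce different sounds, which adds to the surprise effect.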
My third level was space themed, and I found an asset store pack for teleporters, which was perfect as I wanted this level to have more of an interactive feel. I couldn't get the pack's script to work, so I had to create my own, but I used the teleporter pad objects themselves in my game. I set up my script so I could assign things in the inspector, such as the transport location, which object gets transported, and the text UI shown when you enter the trigger zone. I then set up a box with one wall missing and placed a sphere in front of it. I attached a script to make the sphere pushable, as I intended for it to be pushed into a trigger that turns up the volume on an audio clip. I then planned to replicate this across the level until there were enough clips to make a soundtrack for it. I set up a second counter in this scene so players could tell that the aim of the level was to activate 5 triggers. After completing this level I plan to test it on family and friends before submitting my finished project.
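The volume trigger that each pushed sphere activates can be sketched like this — a simplified version of the idea with illustrative names, assuming each soundtrack layer plays silently from the start and is simply unmuted:

```csharp
using UnityEngine;

// When the pushable sphere is rolled into this trigger, the linked
// audio clip is brought up in the mix, building the level's
// soundtrack one layer at a time.
public class VolumeTrigger : MonoBehaviour
{
    public AudioSource layer;     // the clip this trigger brings into the mix
    private bool activated = false;

    void Start()
    {
        layer.volume = 0f;        // layer plays silently until triggered
        layer.loop = true;
        layer.Play();
    }

    void OnTriggerEnter(Collider other)
    {
        if (!activated && other.CompareTag("Pushable"))
        {
            layer.volume = 1f;    // unmute this layer of the soundtrack
            activated = true;
        }
    }
}
```

Starting every layer silently and in sync means that whichever order the five triggers are activated in, the unmuted clips stay in time with each other, and the level's counter can increment in the same `OnTriggerEnter` using the same pattern as the gemstone counter.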