Jukebox Beatdown, Producing, Games, Design, Programming, XR

Blog Post #4: The Finish Line

This most recent sprint was one of the most challenging but fulfilling sprints yet!

The end of this sprint coincided with the final submission deadline for the Oculus Launchpad program, so it was time to finish the project’s remaining essential features while also building out promotional materials and responding to playtest notes. Needless to say, I didn’t sleep much. 😊

The Oculus Launchpad program only distributes grants to a handful of its more than 100 participants based on the merit of their vertical-slices, so my demo had to be looking and playing its best. I didn’t have the money to continue building Jukebox Beatdown beyond my vertical slice, so the project’s survival depended on the grant.

For my project to be eligible for Oculus’s consideration, it had to meet the following requirements:

  1. Provide a complete experience in five minutes or less.

  2. Meet the Oculus Store’s technical standards, especially in terms of performance.

  3. Be accompanied by at least a basic press kit.

In this post, I’ll discuss how I reached each of these goals, which are divided nicely into the categories of Design, Engineering, and Marketing:

  1. DESIGN: Building the game’s five-minute demo by scripting the attacks of Funky Franny, the game’s first boss.  By doing so, we hoped to achieve the game’s central promise of boss-fights set to music.

  2. ENGINEERING: Optimizing the game so that it met the Oculus Store’s technical requirements (and didn’t trigger VR-nausea).

  3. MARKETING: Building a professional and exciting presence for the game online through my press kit.

The final banner art for the project.

Design

This was far and away the most fun sprint of this game’s production because I finally had the technical foundation to fulfill the game’s central promise: boss battles set to music.

Like the movie Fantasia, we wanted all the bosses in Jukebox Beatdown to have their actions choreographed to music.

This was an intimidating goal. The song I commissioned for this project had multiple saxophone, horn, drum, synth, bass, guitar, string, and keyboard tracks, each with a unique melody or rhythm. It was a bit overwhelming to decide which sound would be paired with which attack.

To make this process easier, Ben Young, the game’s composer, designed our song so that it had four distinct, self-contained phases. We thought that these self-contained phases would make it easier for the player to notice when the music matched with the action of the game.

A screenshot of the Ableton project for the game’s song.

To cut through the noise (pardon the pun), I made a list with two columns for each of these four sections. One column had all the tracks in that section (saxophone, guitar 1, guitar 2, etc.) and the other column had a list of attacks I could create for the game. This list was an anthology of riffs on attacks I had seen in other games with great bosses, such as Dark Souls 3, Cuphead, Titan Souls, and Furi, plus some inventions of my own.

From there, it became a jigsaw puzzle of music, visuals, and gameplay. Using the music, which was basically set in stone, as a starting point, I tried to pair the attacks and music together. The design process was complex and went like this: what fits a guitar melody better, a flurry of lasers or a series of rapidly-exploding bombs? If I tie the lasers to the guitar, that means I can’t use them for the saxophone, so what should I tie into the saxophone? If the lasers are more difficult than the bombs, maybe they should be with the saxophone, which comes in predominantly at the end of the song – but now I am using the bombs for the horns, so what should go with the guitar now? Moving one element displaced another.

This process of experimentation continued until we had built out the first several phases of Franny’s fight:

However, now that we had added all this new content, our framerate had dropped into the teens. It was time to put our engineering hat on and optimize the project.

Engineering (Optimization)

From an engineering perspective, VR development can be one of the most challenging forms of game development, largely because even a small amount of lag can make someone sick. If you’ve ever played a new game on older hardware, you’ve probably noticed the game’s graphics lagging for a moment before updating to where they should be. This is never desirable, but it is especially unwanted in VR, where that lag can leave the player nauseous.

The dangers of lag.

People get nauseous when the stimuli that their eyes are receiving do not match the stimuli that the vestibular system in their ear is receiving. (The vestibular system gathers and transmits data about our body’s position through several organs in our ear.) To minimize the difference between these stimuli, Oculus requires all desktop-connected VR content to run at 90 frames per second.

Not my diagram. I would credit this but the original link is now a 404.

Unfortunately, after my first design pass, my game was running at 15 to 30 FPS at its worst on my mid-range Razer laptop. To hit 90 FPS, I had to use many optimization tricks, including:

  1. Using object pools as I mentioned in my previous blog post.

  2. Eliminating almost every unnecessary UI canvas object from my scene in Unity as they were constantly being recalculated, putting unnecessary stress on my CPU.

  3. Eliminating almost every dynamic light in the scene and replacing it with lightmapping, which is essentially the practice of “painting in” a scene’s lights and shadows beforehand rather than simulating them at runtime.

However, the most impactful step for me was reducing my draw calls.

A draw call is a set of instructions that the CPU (Central Processing Unit) gives to the GPU (Graphics Processing Unit) about what to render in a scene. Specifically, a draw call contains the information that the GPU needs to render an object in the scene. While most computers’ GPUs do not struggle to execute these instructions once received, preparing them puts significant strain on the CPU, which results in lag.

To use a filmmaking metaphor, you can imagine the CPU as a location scout and a GPU as a lightning-fast set-builder. In the context of the metaphor, the CPU is visiting the “real location” and sending instructions back to the GPU on what to build in the movie-set. The CPU/location scout looks at objects that make up the “real location” and communicates every visual detail about them to the GPU/set-builder, who recreates them.  However, to extend the metaphor, the CPU/location-scout is using a slow fax machine, so sending these details to the GPU/set-builder takes a long time and can slow down the entire process. Thus, the more succinct the directions can be, the faster the GPU will be able to build the set. We’ll use this metaphor as a way of explaining some of these optimization techniques.

Below is a timelapse that shows a scene in Jukebox Beatdown being rendered one draw call at a time.

To reduce my draw calls, I used two techniques: mesh-baking and texture-atlasing.

Mesh-baking is when a developer takes several meshes (3D models) in their level/scene and turns them into one mesh. If we bake three meshes into one, our CPU will now need to process one draw call for those three meshes instead of three. In the context of Jukebox Beatdown, we generally baked together most meshes that shared the same shader, which is code that dictates how an object reacts to light. Our floor, for example, was made of about sixty different meshes; we baked these into one object.

To continue our movie metaphor, now that the meshes are baked together, our CPU/location-scout can describe, for example, a group of ten stereos as a group of ten stereos rather than communicate information about each stereo one-by-one. Put another way, it’s the difference between counting bowls of rice versus counting individual grains of rice. Preparing and sending the directions is the bottleneck in this context, so using succinct instructions is paramount.
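If you are curious what mesh-baking can look like in practice, here is a minimal Unity C# sketch, assuming all the pieces share one material. This is illustrative rather than my exact script (Unity’s static batching flag can achieve a similar effect for non-moving objects):

```csharp
using UnityEngine;

public class MeshBaker : MonoBehaviour
{
    public Material sharedMaterial;   // the one material/shader the pieces share

    void Start()
    {
        // Gather every child mesh that will be merged into this object.
        MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
        var combine = new CombineInstance[filters.Length];

        for (int i = 0; i < filters.Length; i++)
        {
            combine[i].mesh = filters[i].sharedMesh;
            // Bake each piece's position/rotation/scale into the merged mesh.
            combine[i].transform = filters[i].transform.localToWorldMatrix;
            filters[i].gameObject.SetActive(false); // hide the originals
        }

        // Sixty floor pieces in, one mesh (and far fewer draw calls) out.
        var bakedMesh = new Mesh();
        bakedMesh.CombineMeshes(combine);

        gameObject.AddComponent<MeshFilter>().sharedMesh = bakedMesh;
        gameObject.AddComponent<MeshRenderer>().sharedMaterial = sharedMaterial;
    }
}
```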

Texture-atlasing is the process of aggregating all of a game’s textures onto one image file. If a project is not texture-atlased, every texture in the game is contained within a unique image file. The problem with this setup is that as the number of unique images goes up, the number of draw calls goes up as well. So, in order to minimize the number of images that need to be sent by the CPU, developers pack as many textures as they can onto one image, or texture atlas. The GPU then looks at this atlas for every texture that it needs.

In our location-scouting metaphor, texture-atlasing would mean that instead of taking pictures of every scene in our metaphorical “real-location” and sending them through the slow fax machine, our CPU is instead sending one page that contains all the pictures.
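As a rough illustration of the same idea, Unity can even pack textures into an atlas at runtime with Texture2D.PackTextures. In practice this is usually done offline in the art pipeline, and the field names below are just for the example:

```csharp
using UnityEngine;

public class AtlasExample : MonoBehaviour
{
    public Texture2D[] sourceTextures; // e.g. one texture per stereo, speaker, etc.
    public Texture2D atlas;            // the single combined image

    void Start()
    {
        atlas = new Texture2D(2048, 2048);

        // Pack every source texture into one image; uvRects[i] tells us which
        // rectangle of the atlas now holds texture i.
        Rect[] uvRects = atlas.PackTextures(sourceTextures, 2, 2048);

        for (int i = 0; i < uvRects.Length; i++)
        {
            Debug.Log($"Texture {i} lives at {uvRects[i]} inside the atlas");
        }

        // Any mesh that used one of the source textures can now point its UVs
        // at the matching rectangle and sample the atlas instead, so the CPU
        // only ever has to send one image to the GPU.
    }
}
```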

A texture atlas for the buildings in a virtual city.

All these changes together helped us reach our technical performance goals. Now, it was time to make sure our project’s marketing was as engaging as the project itself.

Producing

The Oculus Launchpad program is highly competitive, with only three to five grants awarded to around fifty entries. I knew that some of the other entrants had teams that were five or more people strong (compared to my two) and had significantly larger budgets than I did, so I knew my project needed to look as polished and exciting as possible.

At the Oculus headquarters in the San Francisco area for the presentation day.

For my project to receive the grant, it had to look professional. I knew that I had the skills to reach the required level of polish as a designer and a programmer, but maybe not as a voice-actor, graphic designer, or 3D modeler. Even if I did have the skills for those latter responsibilities, I knew I didn’t have the time to do them all.

I had $200 and a month to get the project ready for Oculus.

To ensure that I would get everything I needed by the grant deadline (February 28th), I made a list of what features and assets would constitute the vertical slice of Jukebox Beatdown and then planned backwards from that date. I then prioritized each of these items by how important they were and how much lead time they would need in order to be completed. My scrum training came in handy here.

From there, to decide which items I would outsource, I took a “time is money” approach. I estimated how long each item would take in hours if I did it myself and then multiplied that number by my hourly pay. I then compared how much it would cost to pay someone to do the same job on Fiverr. When a task was significantly cheaper to do via Fiverr, I outsourced it.

Ultimately, I ended up outsourcing a considerable amount of work, including the voice-acting, poster design, logo design, and character artwork. I spent the $200 across several vendors to get all of these items, and the work took about two weeks to be delivered.

In the gallery below, you can see the final artwork followed by the original sketch I sent to the Fiverr artists:

For this outsourcing, I used Fiverr, a freelancing website, and it was a great experience. If you decide to use Fiverr for your work, consider using my referral code, which will give you 20% off your first order: http://www.fiverr.com/s2/5b56fb53eb

Full disclosure: I will receive Fiverr credits if you sign up with the above link as part of their referral program. I am not affiliated with them in any other way. My opinions are my own.

Presentation

With my game built, optimized, and marketed, it was time to fly to Menlo Park, CA and present the game:

Next Steps (Several Weeks Later)

Unfortunately, Jukebox Beatdown did not win any of the Launchpad grants. However, I am very happy with the project and will be polishing the vertical slice for release on itch.io so that others can play it.

Amongst other small tweaks, this next sprint will be about:

  1. Adding more attacks to Funky Franny’s moveset.

  2. Adding a system that ranks the player’s performance within the boss battle like this.

  3. Giving the game more “juice,” which means to make the game feel more visceral.

Thank you for following the development journey of Jukebox Beatdown! You can expect one or two more blog posts about the project followed by a release on itch.io!

Jukebox Beatdown, Design, Games, XR, Programming, Producing

Jukebox Beatdown Development Blog #3: Hitting Reset

This past month working on Jukebox Beatdown has been demanding but productive: I rebuilt the game’s mechanics, reconstructed my codebase, and recruited new team-members. In this blog post, I will update readers on new features within Jukebox Beatdown. Along the way, I will also talk about the challenges I faced and how I approached them as a designer and software engineer.

(The section entitled Engineering is structured more like a tutorial, so feel free to skip if you are not interested in computer science.)

Want more Jukebox Beatdown? Join our mailing list:

Overview

My design process is very player-centric, so playtesting (asking friends to give feedback on the game) is a crucial part of my process. My friends’ feedback provides direction for the next phase of development. If you have been following this blog, you may remember that my most recent playtest session gave me four primary notes:

  1. The game needs more “juice.” In other words, there needs to be more feedback for the player’s inputs. More simply, the gameplay is not satisfying.

  2. If the game is going to be marketed as a rhythm game, music needs to be a bigger part of the game’s mechanics.

  3. It needs to be clear that the player’s hands are the player, not their head. Alternatively, this mechanic needs to be reworked.

  4. Most importantly, the core gameplay loop (“boss fights synced to music”) sounds compelling, but the most recent execution of that idea is not engaging players.

This blog post will cover three main topics: Design, Engineering, and Producing.

A still from the previous iteration of Jukebox Beatdown. Dr. Smackz, the giant boxer pictured above, punched to the beat of Mama Said Knock You Out by LL Cool J.

Design

I am generally a bottom-up designer, which means that I try to find a fun, exciting, and unique mechanic then build other aspects of the game (story, art, engineering) around that mechanic.

While the above adjectives are subjective terms, there are a few concrete questions that can confirm the presence of each:

  1. Fun: If a player plays a prototype of my game, do they want to play it a second time?

  2. Exciting: When someone hears the elevator pitch for my game, do they ask a follow-up question?

  3. Unique: When someone hears the elevator pitch for the game, do they assume it is a clone of an existing game? (I.E., “So it’s basically Beat Saber?”)

As I mentioned in my previous blog post, Jukebox Beatdown was passing the “Exciting” test but failing the “Unique” and “Fun” tests. People were hooked by the pitch but bored by the gameplay. They also perceived the game as being a Beat Saber-clone, when it was actually a rhythm game crossed with a bullet-hell game.

Beat Saber is one of the most famous VR games and by extension, the most famous VR rhythm game. Due to its fame, I wanted to steer clear of any mechanic that resembled Beat Saber too closely.

Given this, I decided it was time to start over and try to create a new mechanic that engaged players and incorporated music. If I could not create a new mechanic that passed these requirements in two weeks, I would put the project on ice.

My previous game’s most popular minigame revolved around using tennis rackets to keep falling eggs from hitting the ground. It was a weird game, but a fun one!

My last VR game had found success with a basic “bat” mechanic (you had to bounce eggs falling from the sky with tennis rackets), so my first inclination was to prototype a version of that mechanic that could work in the context of Jukebox Beatdown. I created a “Beat Bat” that the player could swing at enemies. If they hit the enemy on the beat, which was designated by a bullseye-like icon that shrank as the beat approached, they would get a critical hit.

A very quick screen-grab of the Beat Bat prototype. It didn’t feel intuitive to me and it felt too similar to Beat Saber.

As a player, I found this mechanic difficult and awkward. It also felt too much like Beat Saber, so I went back to the drawing board once more.

My next idea was to have the player shoot on the song’s beat in order to apply a damage multiplier to each bullet. I was worried that this mechanic would feel shallow, but I also figured it would be accessible to less musically-inclined players, so I built a quick prototype. My first prototype rewarded the player with a more powerful bullet when they shot on the beat in 4/4 time regardless of whether a note was played at that time. I liked this mechanic, but it felt too basic and unsatisfying.
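To make the idea concrete, here is a simplified sketch of that first shoot-on-the-beat check. It assumes a fixed BPM and uses Unity’s audio clock; the names and numbers are illustrative rather than the exact code from the prototype:

```csharp
using UnityEngine;

public class BeatChecker : MonoBehaviour
{
    public float bpm = 110f;           // tempo of the song
    public float windowSeconds = 0.1f; // how forgiving the timing window is

    private double songStartDspTime;

    // Call this at the same moment the song starts playing.
    public void OnSongStarted()
    {
        songStartDspTime = AudioSettings.dspTime;
    }

    // Called whenever the player fires; returns true for an on-beat shot.
    public bool ShotIsOnBeat()
    {
        double secondsPerBeat = 60.0 / bpm;
        double songTime = AudioSettings.dspTime - songStartDspTime;

        // Distance from the nearest beat, in seconds.
        double offset = songTime % secondsPerBeat;
        double distance = System.Math.Min(offset, secondsPerBeat - offset);

        return distance <= windowSeconds;
    }
}
```

Using the audio clock (AudioSettings.dspTime) instead of frame time keeps the check in sync with the music rather than the frame rate.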

To learn how to make the rhythm mechanic more compelling, I decided to study existing rhythm games made for both VR and traditional platforms. I studied Audica, Beat Saber, and Rez Infinite, but by far the most useful game to play was Pistol Whip. Pistol Whip was useful partly because it had a similar shoot-on-the-beat mechanic, partly because its implementation of that idea frustrated me, and partly because it was built for a different kind of audience. These elements made me think about how Jukebox’s mechanic could be different and, I thought, more satisfying to its audience. (As a side note, all of the games mentioned above are excellent. My notes below reflect my personal tastes rather than my opinion of each game’s quality. They are all extremely well-crafted.)

Below were my main takeaways and notes from playing those games:

Pistol Whip:

Pistol Whip is an on-rails VR shooter in which you get additional points when you shoot on the beat.

  • The shoot-on-the-beat mechanic always applied even if there was no instrument playing on the beat. This created awkward moments in which you had to shoot on the beat but there was no way for you to know when the beat was occurring besides a faint visual tremor in the level’s environment art and a haptic effect on your controller.

  • The shoot-on-the-beat mechanic served no strategic purpose; this made it less compelling. As far as I could tell, there was no incentive to shoot on the beat besides raising your score. I felt that this limited the appeal of the game to people who care about high scores. As someone who never felt strongly about leaderboards, this made the mechanic less interesting to me. (There are people who love this kind of mechanic, so points and leaderboards are a great idea; I just felt the mechanic was a missed opportunity.)

  • The feedback for shooting on the beat was too subtle: when you shot on the beat in Pistol Whip, the only feedback you got was a red dot above the enemy you shot. This felt like a missed opportunity to reward players.

  • Your hitbox was your head: in Pistol Whip, you are your head. In other words, to dodge the enemy’s bullets, you need to move your head around. I’m not a fan of this design because:

    • I personally dislike standing up for long periods of time to play a game.

    • I worry about banging my head against my furniture.

    • My hands afford me finer control than my head does.

    • This control pattern makes the game inaccessible to some players with disabilities.

Audica:

Audica is a music-shooter from Harmonix, the creator of Guitar Hero. It was one of my favorites.

  • A rhythm-game can be made more interesting by requiring the player to hold down the trigger at times. This was a mechanic I had not seen in many other VR rhythm games and which I may incorporate into Jukebox Beatdown in the future.

  • Audica has fantastic particle feedback for every successful hit. Particle feedback is highly satisfying.

Rez Infinite:

Rez Infinite is a critically acclaimed music-action-adventure game in which your shots are timed to the music.

  • Rez Infinite made the interesting choice to ensure that the player’s bullets always hit the enemies on the beat by having the player lock on to enemies and then fire homing missiles rather than shoot them directly. When the beat played, the missiles would fire out of the player and hit the locked-on enemies so that it appeared the player had hit the enemy in perfect time with the beat. I want to recreate this effect with the homing missiles Jukebox Beatdown’s bosses will use against the player.


With these notes in mind, I built a new prototype with changes that I felt made the gameplay more interesting:

  • Shooting-on-the-beat became a risk vs reward mechanic. If a player shot on the beat consistently, they would be awarded an increasing damage multiplier: their first shot on the beat would multiply their damage by two, their next shot would multiply their damage by three, and so on. However, if the player missed the beat or was hit by an enemy, their multiplier would reset to one. This gave players two options: they could either time their shots to the beat in pursuit of a high damage multiplier (but have to lower their firing rate to do so) or they could ignore the multiplier and simply blast away, making up for their lack of damage per bullet with a higher firing rate.

  • Shooting-on-the-beat was made more dramatic. As the player shot on the beat, their bullets would grow larger and change color. Additionally, a Guitar Hero-esque combo counter was tied to the player’s hands.

A GIF of the combo multiplier in action.
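The multiplier logic itself is simple. Here is a hypothetical sketch of it (illustrative names, not the shipped code):

```csharp
using UnityEngine;

public class DamageMultiplier : MonoBehaviour
{
    private int multiplier = 1;

    // Called when a shot is fired; onBeat comes from the beat-timing check.
    public int GetDamageForShot(bool onBeat, int baseDamage)
    {
        if (onBeat)
        {
            multiplier++;      // x2, x3, x4... for consecutive on-beat shots
        }
        else
        {
            multiplier = 1;    // missing the beat resets the streak
        }
        return baseDamage * multiplier;
    }

    // Getting hit by the boss also resets the streak.
    public void OnPlayerHit()
    {
        multiplier = 1;
    }
}
```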

With the shoot-on-the-beat mechanic on firmer ground, it was time to incorporate more music into the boss’s attack patterns. In my previous prototype, I had programmed the prototype’s boss, a giant boxer, to punch the player on the beat. However, almost none of my playtesters perceived that the boss’s attacks were tied to the music, and some even expressed that they wished they had been!

It was time to turn things up a notch. I felt that if players did not recognize events on the beat, they might recognize specific actions tied to specific notes. With some engineering secret sauce, I put together a pipeline that automatically tied notes to specific game-events. For example, a C# note could fire a rocket while a B note would shoot a laser.

The red bullets that the Beat Brothers are dodging are triggered by notes within the song’s melody.
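I won’t reproduce the music-sync plugin here, but the routing layer on top of it can be sketched in a few lines. Assume something upstream calls OnNotePlayed with the name of each note the moment it sounds; the note names and attacks below are just examples:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

public class NoteEventRouter : MonoBehaviour
{
    private readonly Dictionary<string, Action> noteActions = new Dictionary<string, Action>();

    void Awake()
    {
        // Each note name maps to a boss attack.
        noteActions["C#4"] = FireRocket;
        noteActions["B3"]  = FireLaser;
    }

    // Called by the music-sync layer whenever a mapped note plays.
    public void OnNotePlayed(string noteName)
    {
        if (noteActions.TryGetValue(noteName, out Action attack))
        {
            attack();
        }
    }

    void FireRocket() { Debug.Log("Rocket fired on C#"); }
    void FireLaser()  { Debug.Log("Laser fired on B"); }
}
```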

However, as I will note in the Looking Forward section, this change was still too subtle for players to notice.

Engineering

This section is more technical in nature and resembles a tutorial. If you are not interested in computer science, feel free to skip this section.

After I implemented the above mechanics, I found the game had two big problems: poor performance and significant disorganization. 

Performance:

Due to the high number of projectiles on screen and some funkiness with my music-syncing plugin, Jukebox was crashing within ten seconds of starting.

To fix this, I implemented a game programming pattern called an Object Pool. The purpose of an Object Pool is to enable a program to generate and manipulate large groups of in-game objects without adversely affecting performance. Large groups of objects can cause problems because the operations for Creating and Destroying these objects are computationally expensive, especially when executed many times per frame. To sidestep this issue, the Object Pool instead generates objects at program-start then places them within a “pool” of similar objects. When one of these objects is required, it is activated and moved to where it needs to be. Once it is no longer needed, it is deactivated until it is required once more. This saves performance significantly because it removes the need to perform many expensive Create and Destroy operations.

In the case of my game, this pattern was a lifesaver because the gameplay evolved to include up to 80 bullets on-screen at any given time. With this pattern in place, I was able to eliminate crashes.
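Here is a minimal sketch of what a bullet pool like this can look like in Unity C#. Recent Unity versions also ship a built-in ObjectPool<T> in UnityEngine.Pool, but a hand-rolled queue shows the idea more clearly; the names are illustrative:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class BulletPool : MonoBehaviour
{
    public GameObject bulletPrefab;
    public int poolSize = 80;   // worst case: ~80 bullets on screen at once

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    void Awake()
    {
        // Pay the Instantiate cost once, at program start.
        for (int i = 0; i < poolSize; i++)
        {
            GameObject bullet = Instantiate(bulletPrefab);
            bullet.SetActive(false);
            pool.Enqueue(bullet);
        }
    }

    // Instead of Instantiate(): activate a pooled bullet and place it.
    // (A real implementation would also handle the pool running dry.)
    public GameObject Spawn(Vector3 position, Quaternion rotation)
    {
        GameObject bullet = pool.Dequeue();
        bullet.transform.SetPositionAndRotation(position, rotation);
        bullet.SetActive(true);
        return bullet;
    }

    // Instead of Destroy(): deactivate and return the bullet to the pool.
    public void Despawn(GameObject bullet)
    {
        bullet.SetActive(false);
        pool.Enqueue(bullet);
    }
}
```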

As the bullets go offscreen, they are deactivated and returned to the pool. From raywenderlich.com, one of the many resources I used to learn how to create an Object Pool. Click on the link above to learn more about Object Pools.

Organization:

Once I felt more confident about the project’s direction, it was time to refactor my code.

During prototyping, my code had gotten messy, and I found myself losing time because I was writing multiple versions of the same few functions for similar classes. For this reason, I decided to create some basic Inheritance Trees.

If you are not familiar with Inheritance Trees, they are a way of organizing code so that it incorporates basic is-a relationships. Is-a relationships are useful because they allow us to define objects using abstract terms rather than stating every attribute of every object from scratch. The textbook example is classifying animals:

Assume that you do not know what a Dog is, but you do know what an Animal is. If I tell you that a Dog is an Animal, you may, for example, know that an Animal is living and that it can reproduce, so a dog must also be able to do those things. Rhinos and Chickens, by virtue of being Animals, must also have these attributes.

A diagram of the Animals inheritance tree.

Assume now that you know what a Dog is but you do not know the traits of particular Dog breeds. If you know a Dog has ears and four legs, you can assume that a Greyhound, Chihuahua, and Great Dane do as well. That is the value of an Inheritance Tree: it enables you to define objects/classes without having to repeat yourself.

To write the inheritance tree for my health functionality, I took all my health classes and wrote down all their properties and functions. I then identified which properties and functions I wanted in every object that had health. These “essential” functions and properties were then put into my most basic health class. After this class was written, I worked myself “down” the tree, creating increasingly more specific classes as necessary.

One advantage of an inheritance tree like this is that it helped me enforce certain design standards. For example, I wanted every object to play a sound effect and spawn an explosion effect when it “died” so that combat always felt dramatic. By defining this functionality in my base Health class, it was included in all the sub-classes (descendants of the base Health class like EnemyHealth, PlayerHealth, and BossHealth), so I did not have to remember to write the same functionality for every sub-class.
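As a rough sketch of what that base class might look like (the names and details here are illustrative, not my exact code, and in a real Unity project each MonoBehaviour lives in its own file):

```csharp
using UnityEngine;

public class Health : MonoBehaviour
{
    public int maxHealth = 100;
    public AudioClip deathSound;
    public GameObject explosionPrefab;

    protected int currentHealth;

    protected virtual void Awake() => currentHealth = maxHealth;

    public void TakeDamage(int amount)
    {
        currentHealth -= amount;
        if (currentHealth <= 0) Die();
    }

    // Every object with health dies dramatically: sound + explosion.
    // Sub-classes add to this instead of re-implementing it.
    protected virtual void Die()
    {
        AudioSource.PlayClipAtPoint(deathSound, transform.position);
        Instantiate(explosionPrefab, transform.position, Quaternion.identity);
        Destroy(gameObject);
    }
}

// BossHealth is-a Health: it inherits the dramatic death for free and only
// adds what is unique to bosses.
public class BossHealth : Health
{
    protected override void Die()
    {
        base.Die(); // sound + explosion from the base class
        Debug.Log("Queue the next phase of the fight");
    }
}
```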

Producing

One of the more challenging aspects of this project has been finding appropriate and compelling music on a tight budget.

Fortunately, I’m excited to announce that Benjamin Young, a composer whose credits include Love, Death, & Robots and Star Wars: The Old Republic, will be composing and recording an original disco song for the game’s vertical slice! Check out more of Ben’s awesome music here!

For the last bit of exciting news, I’m happy to say that we have a new logo! In my next blog post, I will introduce our new poster and concept art as well!

The new Jukebox Beatdown logo.

Looking Forward

After implementing the above changes, I hosted a playtest party in December to see how players would react.

The response from the twelve play-testers was generally positive and it was clear that the next move would be augmenting these changes rather than changing direction once more.

The critical feedback grouped into four main points:

  • The shoot-on-the-beat mechanic could be made more complex. In its current iteration, the shoot-on-the-beat mechanic is tied to the snare drum section of Disco Inferno. Some playtesters felt that the rhythm section should be made more complex.

  • The boss’s attacks need to be made more interesting. At present, the boss follows a simple attack pattern in which he moves around the stage and then does a special attack in which he spins while shooting many green pellets.

  • The player needs more interesting choices within the game-loop. The risk-vs-reward dynamic in the shoot-on-the-beat mechanic is interesting, but I can give the player more opportunities to make interesting choices. For example:

    • Create “beat zones” that reward the player with additional points when they hit the beat inside them. Make this a risk vs reward mechanic by putting the zones in dangerous spots.

    • Build a mechanic around holding down the trigger.

    • Reward the player for alternating which Brother/hand shoots on the beat.

  • There needs to be clearer feedback for the player’s actions. Combos, hits, and boss attacks need more visual and audio feedback.

Thank you for following the development of Jukebox Beatdown! To get blog posts, beta-invitations, and launch coupons for the game delivered to your inbox, sign up for our mailing list below:

Certificates, Producing

I have become a Certified ScrumMaster!

My Certified ScrumMaster certificate from the Scrum Alliance.

In addition to being a game designer or engineer, one long-term goal of mine is to become a video game producer! To accomplish this goal, I recently asked a few game producers I know what I could do to prepare myself to be a producer. They said:

  1. Learn Jira and Confluence.

  2. Become a Certified ScrumMaster.

To accomplish the latter, I decided to attend a ScrumMaster class this weekend and take the CSM exam. I got a perfect score, so I am now an official Certified ScrumMaster! (I also got a Jira certificate this week.)

What is Scrum?

To read the official description of Scrum, click here.

Scrum is a project management framework used to deliver complex products. It is built to avoid common problems in complex projects like:

  • Excessive crunch.

  • Building a project that nobody wants.

  • Running out of budget or time.

Without using the Scrum lingo, I would break down the Scrum process into the following high-level steps:

  1. Generate ideas for potential features for your product.

  2. Prioritize a list of those features.

  3. Determine a length of time shorter than a month and longer than a week during which you will work on those features.

  4. Select the highest priority features that you can commit to finishing within that allotted amount of time.

  5. Work on those features for that allotted amount of time, no more or no less.

  6. Present your completed work to relevant stakeholders.

  7. Retrospective: Facilitate a discussion with your team about how they can work together better in the future.

  8. Repeat, skipping step three. You generally use the same amount of time for every sprint of work.

Throughout this process, you also periodically refine your list of items to complete.

I am purposely not using the Scrum lingo above so people new to Scrum can more easily understand the process as a whole. I also left out a number of important steps for simplicity.

If you are curious about Scrum, please reach out to me or check out the official Scrum website.

Certificates, Producing

I Completed a Course on Jira and Confluence Administration!

In addition to working as a game designer or engineer, one of my long-term goals is to become a game producer, so I recently asked a few professional game producers I know what I could do to get to that position. They said:

  1. Learn Jira and Confluence.

  2. Get your ScrumMaster certification.

  3. Make more games!

To do the first step, I decided to take the above online course on Jira and Confluence administration. If you are not familiar with Jira or Confluence, they are two of the most-used tools for teams creating large software projects. Jira helps teams plan their workload while Confluence allows them to organize their documents.

Coincidentally, I will be getting my ScrumMaster certification this weekend. After that, I will be continuing work on the other games mentioned in this blog!