Monday, February 24, 2014

INFR 4350 - Midterm Review

So Reading Week is over. That's unfortunate considering I really didn't get much done in terms of studying or assignments. Spent a lot more time working on other things which I guess is ok though. Regardless, I've been looking over some of the notes for HCI and decided that this blog will be dedicated to some of the stuff I learned in the first half of the course and how I'm already applying some of it.

Some of the biggest topics that we've covered in the class so far have been user-centered design, prototyping, and evaluation. All of these concepts go hand in hand because, at the end of the day, designers need to design something that users want. The way we find out what they want is by iterating and researching the effectiveness of our designs.

A data system I designed and proposed back in January showing a flowchart of data movement.

All of those concepts are things I was familiar with before coming into the course, but the course goes into more depth on them. This has given me a better understanding of how things need to be done, especially in terms of evaluation and design processes.

One thing that I've been doing more of lately is low-fi prototypes and designs. Previously I subscribed to the idea of just working on something until it's done, but nowadays I find myself opting for a more structured approach with diagrams and everything. For example, I recently made an interactive design for one of the projects in my social network class.

You know you've been web devving too much when you start to open everything in Notepad++ first. You can check out the thing here.

We are supposed to be designing a website. At first I was planning on just drawing out a series of pictures in my notebook, but then I thought back to one of our lectures, where we saw a video of a Korean team testing their paper prototype by moving around sheets of paper. I thought, "Hey, I could do that, but with Flash since that's less effort," so I did.

I think it turned out pretty decent. Obviously it's not the best thing ever since I did it really quickly, but you get a sense of the layout and how navigation is supposed to work. Going forward I'm going to brainstorm and research a bit more and refine it into a better prototype.

Next iteration is going to have colours. #StayHyped.

Overall, I feel like in terms of information, this course has given me a more in-depth look at things I already knew. It fleshed out ideas that I already had in my head and stressed the importance of being concise and immaculate with designs and evaluations.

It's also given me a new perspective on certain things. One thing that I really enjoyed was the Lego activity two weeks ago, where we got into groups and had one person direct the others in building a Lego structure. It was pretty fun, and it gave me a new appreciation for how clear and concise you have to be when explaining things to your users.

GDSoc can't afford Legos unfortunately.

Well, here's to my final half semester of UOIT Game Dev.

Monday, February 10, 2014

INFR 4350 - Iterative Design in StarCraft II

One of the topics we covered this week was iterative design using evaluation. Iterative design is the practice of improving a product through repeated cycles: every iteration is tested and evaluated, and the next one is improved based on that feedback.

In a way, I feel that an online game like StarCraft II is a good example of iterative design. Since it is an online game, patches are often deployed to improve the balance and enjoyability of the game. In a way, a game like StarCraft II is never truly "finished development" since it can be continually monitored and improved on. 

As evidenced by changes such as the Hellbat nerfs.

The way I see it, as long as the game is still supported by Blizzard, it's always in a formative evaluation stage. This means that they will continually check to make sure the game still meets the players' needs. When they stop supporting the game, it can enter the summative evaluation stage, and the results of that evaluation can go into making StarCraft III a better game.

In the lecture slides, they mentioned "why, what, where and when to evaluate". "Why" is to check the users' requirements and to see whether they can use the product and whether they like it. In terms of StarCraft II, that means Blizzard wants to test whether or not the players like playing their game and like watching the tournament streams.

I quite enjoy watching tournament streams.

"What" denotes a conceptual model or early prototypes. This could be representative of the game's state after every patch. When making historical comparisons, a lot of people in the community refer to things by their patch number, similarly to how human history is divided into eras. Each new patch builds upon the old one, so in a way each patch is a prototype for the next.

The next one, "where", refers to the setting where the evaluation takes place. As far as I know, Blizzard's testing is either external or internal. External testing is an analysis of players' game data and the opinions of the community. Internal testing is done by their in-house testers, as well as any professional players or community figures they invite for in-person feedback.

The result of careful evaluation. (Youtube)

Lastly, "when" was covered earlier. Evaluation occurs constantly throughout the game's life cycle. As long as the game is still supported by Blizzard, formative evaluation will occur.

Overall, iterative design is something I wish I did more of. Unfortunately, time constraints during the school year mean that not a lot of it can be done. Hopefully in the future I can do more. I always find it fascinating how a product changes as it is being developed.

Monday, February 3, 2014

INFR 4350 - Nielsen's Heuristics

Last week we discussed Nielsen's heuristics in class. They are essentially a list of rules of thumb that can be used in the design and development of a piece of software to make it better. The way I see it, they can act as a generic checklist to make sure your software avoids the most common usability faults.

For the purposes of this blog entry, I'm going to list all ten of them and give an example of when I encountered them in the past and how I dealt with them (if I did). The first one is "visibility of system status", or keeping users informed about what's going on. Back in first year, second semester, when we worked on our 2D game, I was often annoyed at how the game felt like it froze while loading. As a result, I added a simple line of text that let the player know the game was loading, which suddenly made it feel much better.

Your fate is controlled by a single line of text.
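As a rough sketch of the idea (this isn't the actual game code, and the asset names and function are made up for illustration), the loading step just needs to report status instead of silently blocking:

```python
def load_assets(asset_names, on_status=print):
    """Load each asset, reporting progress so the user knows nothing has frozen."""
    loaded = {}
    total = len(asset_names)
    for i, name in enumerate(asset_names, start=1):
        on_status(f"Loading {name}... ({i}/{total})")
        loaded[name] = f"<data for {name}>"  # stand-in for the real file loading
    on_status("Done!")
    return loaded

load_assets(["sprites", "sounds", "levels"])
```

Even a status message this simple is the difference between "the game is loading" and "the game has crashed" in the player's head.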

The second heuristic is "match between system and real world" which means that the software's information is displayed in a way that is understandable to the average user. When I worked on the Maya plugin over the winter break, I was generally pretty good at displaying error messages. But when the deadline started approaching and things got more complex, I unfortunately just resorted to printing out generic error messages I received. They weren't even helpful to me, let alone the users, but it was quicker and simpler to do so.

"User control and freedom" means that there is a clear way for the user to do things. The recent Global Game Jam game I made, Relativity, was made in 48 hours and as a result we didn't have a lot of time to polish it up. If you played the game in full screen mode, there is actually no way to exit the game. The player would have to alt-tab and manually close it from Windows. Obviously, having an exit game button would solve this.

Player control is all relative.

The fourth heuristic is "consistency and standards", meaning that everything should be consistent and intuitive. The very first game I made, Pew Pew, violates this one pretty hard. The entire game is played with the keyboard except for two situations: advancing to the next level and detonating a bomb. Oftentimes players have no idea that they need to use the mouse at all and get stuck, which is a sign that I'm doing it wrong. I know better now, of course.

The next heuristic is "error prevention", something that I definitely need to work on. A generic case of when I dealt with this is any software that lets users type into an interactive text box. I've learned to do validity checks on string fields and to limit number fields to digits only, because it's pretty surprising how badly a piece of software can break if an incorrect character is entered.

Would your system break if they enter the same character(s) you're using as your delimiter?
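A minimal sketch of that kind of check, in Python (the delimiter and the field rules here are hypothetical, not from any specific project of mine):

```python
DELIMITER = ","  # hypothetical: the character a save format uses to separate fields

def valid_text_field(text):
    """Reject empty input and anything containing the delimiter, which would corrupt records."""
    return bool(text) and DELIMITER not in text

def valid_number_field(text):
    """Accept digits only, instead of letting a conversion blow up somewhere downstream."""
    return text.isdigit()

print(valid_text_field("Bob"), valid_text_field("a,b"), valid_number_field("42"))
```

The point is to catch the bad input at the text box, before it ever reaches the code that would choke on it.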

The sixth heuristic, "recognition rather than recall", means that the user shouldn't need to remember information. Instead, they should be able to just recognize what they need to do. My very first GDW game, Portal Puffs, breaks this rule. You are allowed to save and load your game, but when you load, you have to remember what your save file was called. Having a list would have been nice.
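A load menu fixes this by showing the options instead of asking the player to remember them. A minimal sketch, assuming saves are files in a folder (the folder and the ".sav" extension are invented for the demo):

```python
import os
import tempfile

def list_saves(save_dir, extension=".sav"):
    """Return the available save names so the player picks by recognition, not recall."""
    try:
        files = os.listdir(save_dir)
    except FileNotFoundError:
        return []
    return sorted(f[:-len(extension)] for f in files if f.endswith(extension))

# Demo with a throwaway folder standing in for the game's save directory.
demo_dir = tempfile.mkdtemp()
for name in ("castle.sav", "swamp.sav", "readme.txt"):
    open(os.path.join(demo_dir, name), "w").close()
print(list_saves(demo_dir))  # the load menu would display these names
```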

The lecture slides say that "flexibility and efficiency of use" means letting users tailor frequent actions and making interaction more efficient. I don't think any of the projects I've worked on allows the user to do this. The closest thing I have is letting users turn mouse aim on and off in SHFT, which is a pretty big deal since a lot of people can't play it properly with it turned off. Having more customization is definitely something I want to do in future projects, though.

Allowing you to shift into an easier mode by pressing M.

I'm currently working on a research project that deals with visualization. It's not done yet, but I think that it applies to "aesthetic and minimalist design", the next heuristic. This is because I'm designing a system that visualizes data. As a result, it should look nice and clean so that it can present the data that I need to present in a quick and intuitive way. It defeats the purpose of visualization if it looks bad after all.

The ninth heuristic is "help users recognize, diagnose and recover from errors". I honestly don't think I've encountered this one because I don't do as much error handling as I should, so when something goes wrong, the program just dies. As a result, I don't have the opportunity to help the user recognize and recover from their errors. I really need to work on that.

The aftermath of an error.
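A hedged sketch of what recovering gracefully could look like instead (the file name, the settings format, and the messages are all invented for illustration):

```python
def load_settings(path, defaults=None):
    """Try to read key=value settings; on failure, explain the problem and fall back."""
    defaults = defaults or {"volume": "5", "fullscreen": "off"}
    try:
        with open(path) as f:
            pairs = (line.strip().split("=", 1) for line in f if "=" in line)
            return {k: v for k, v in pairs}, None
    except FileNotFoundError:
        # Recognize + diagnose + recover: say what went wrong, then carry on with defaults.
        return defaults, f"Couldn't find '{path}', so default settings were loaded."

settings, warning = load_settings("settings.cfg")
if warning:
    print(warning)
```

Instead of dying, the program tells the user what happened in plain language and keeps running, which is exactly what the heuristic asks for.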

Finally, the last heuristic is "help and documentation". Going back to Relativity, when the player first loads up the game there is no information on how to play it. However, we have a tutorial level where the player is slowly introduced to the game's mechanics in a short little walkthrough. I think this works out pretty well since it lets the player immediately start playing the game and guides them as they go.

Overall, I found that Nielsen's heuristics are pretty cool. They've given me a lot to think about, and going forward I'm going to try to keep them in mind when I make new games and software.