Sunday 24 July 2016

Lenk's Quest part II: Controlling the display with AVR

Earlier I managed to get the LCD1602 display with an I2C module working with a Raspberry Pi and Python (see part I and part II). Now the challenge is to repeat the same thing with an ATtiny and C, and to take it a step further.

As a disclaimer, if you are interested in making an AVR program that uses a display and are following this (or any other) post as a tutorial, I have one piece of advice: don't do it blindly. I think it is quite obvious that I'm not a pro at this (especially after this post), and while my technical posts are meant to work as guidelines, they are surely not tutorials. With that in mind, read critically and don't repeat my mistakes.

AVRs have this so-called Universal Serial Interface (USI) which handles serial communication at the hardware level. Compared to software implementations, the USI allows faster communication with less code. The USI has two modes: three-wire and two-wire. The two-wire mode (TWI) is identical to I2C, but apparently owing to licensing issues or something it can't be called that.

My first plan was thus to use TWI instead of writing my own code. It's built in, so it's probably easy to use, right? Soon after I took out the datasheet, I came to the conclusion that I didn't want to stress my brain cells figuring that shit out. I'll probably regret the decision later, but it was Sunday after all.

I switched to plan B, which was to dig out my Python code and translate it to C. Implementing a serial protocol in software instead of hardware is called bit banging, by the way. Totally unrelated, it brings to mind a certain Finnish word, 'pilkunnussija', used for a person who is unnecessarily keen on detail. Literally translating to 'comma fucker', it is somewhat synonymous with 'grammar Nazi', except with an extra serving of heat and lubrication.

Up to this point, I've been writing all my code into a single .c file, but that is not a very good practice for a larger project like Lenk's Quest. To make the source code more manageable, I'll split it into multiple files. There are two parts to controlling the screen: the I2C communication and the display-specific commands. Thus it is natural to make a separate file for the I2C-related functions and definitions, which are then used by the screen control functions in another file.

Before going through the source code, I want to make a small detour. If one desires to use multiple .c files in their program, they have to accompany them with header files as well. Header files (.h) contain macro definitions (i.e. shorthands and mnemonics) and function declarations. But why are they needed?

Suppose your code uses the function eat_apple, which is defined in another file. Without including the appropriate header file, there would be no way for the compiler to know what the function takes in and what it outputs. In C this kind of ignorance is not tolerated. A header file with the line double eat_apple(int); removes this problem. The .c file actually containing the definition of eat_apple is not needed at this point; in the extreme case it doesn't even have to exist yet! It is not required until it is compiled and linked together with all the other source code files, forming a single executable program. (For more info about headers and linking, see here and here.)
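
To illustrate with a toy example (not part of the project): the declaration lives in a header, the definition in its own .c file, and a third file can use the function knowing only the header.

/* apples.h -- declaration only */
double eat_apple(int count);

/* apples.c -- definition */
#include "apples.h"

double eat_apple(int count){
    return count * 0.5;    //whatever eating apples yields
}

/* main.c -- user of the function */
#include "apples.h"

int main(void){
    double juice = eat_apple(3);    //only the declaration is needed to compile this
    return (int)juice;
}

The definitions only have to come together at the linking stage, e.g. with something like avr-gcc -mmcu=attiny85 -Os main.c apples.c -o main.elf.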

I2C Communication


As with my Python code, I only need unidirectional I2C communication: from the microcontroller to the screen. I want to keep it simple, but since I might want to recycle the code in the future, I want to make it general purpose as well. (When programming, it is always good practice to keep the code general rather than specific. It makes reusing and modifying the code easier and helps develop your thinking.)

My header file i2c.h looks like this:

#ifndef I2C_H_
#define I2C_H_
#include <avr/io.h>
#include <util/delay.h>

#define SDA_PIN 4
#define SCL_PIN 3

void init_i2c();
void write_bytes(uint8_t address, uint8_t *bytes);

#endif

The first two lines, together with the very last one, form the header guard that prevents the file from being included multiple times (which would almost certainly happen when dealing with more than one file). Inside the guard I first include some AVR libraries that I'll need later on in the source code file i2c.c.

The next two lines define macros for the pins I'll be using for the communication. This sets PB4 as the data line (SDA) and PB3 as the clock line (SCL). Owing to the includes, I could alternatively have written this as:

#define SDA_PIN PB4
#define SCL_PIN PB3

Then come the function declarations. The first one is the initialisation function init_i2c, which sets the defined SDA_PIN and SCL_PIN as outputs and their states to HIGH (the I2C bus ready state). The second one, write_bytes, sends a sequence of bytes to the given address. bytes is a pointer that is interpreted as a null-terminated string: something I thought was a smart choice because that way I don't have to count the number of bytes and pass it to the function separately. It eventually turned out to be, erm, not so smart after all, but I'll get back to that later.

Let's move on to the actual source code in i2c.c. The beginning of the file looks like this:

#include "i2c.h"

void init_i2c(){

    //set SDA and SCL as output
    DDRB |= (1 << SDA_PIN)|(1 << SCL_PIN);

    //set both high
    PORTB |= (1 << SDA_PIN)|(1 << SCL_PIN);
    _delay_us(10);

}

Nothing unusual here. SDA_PIN and SCL_PIN dictate the positions of the pins and the bits in DDRB and PORTB are set accordingly.

As the MCU toggles its pins much faster than the Raspberry Pi did with Python, I've added some delays to make sure that the receiving party can keep up. They shouldn't be longer than necessary, as that slows down the communication. The ones you see here are surely not optimised; I just picked a safe value and shoved delays everywhere I could, as I wanted to avoid any bugs owing to too-fast pin toggling. There are probably guidelines of some kind that I'll look up later (or not, depending on whether the current solution is fast enough; there's no need to optimise if it already meets the requirements).
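
For scale, a rough back-of-the-envelope estimate (not measured): each bit in the code below takes three 10 µs delays, so the bus runs at roughly 30 kHz, comfortably below the 100 kHz standard-mode I2C limit. One possible tidy-up would be to keep the delay behind a single macro so the speed can be tuned in one place, something like:

#define I2C_DELAY_US 10                     //bit-bang delay per phase
#define i2c_delay() _delay_us(I2C_DELAY_US)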

The write_bytes function uses an auxiliary function called write_byte, which writes a single byte to the I2C bus, then reads and returns the acknowledge bit:

int write_byte(uint8_t byte){
    //write a single byte to the bus

    //clock down
    PORTB &= ~(1 << SCL_PIN);
    _delay_us(10);

    int i;
    for(i=7;i>-1;i--){
        //set SDA
        if(byte & (1 << i))
            PORTB |= (1 << SDA_PIN);
        else
            PORTB &= ~(1 << SDA_PIN);
        _delay_us(10);

        //clock high
        PORTB |= (1 << SCL_PIN);
        _delay_us(10);

        //clock down
        PORTB &= ~(1 << SCL_PIN);
        _delay_us(10);

    }
    //read ACK

    //set SDA as input
    DDRB &= ~(1 << SDA_PIN);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);

    //sample ACK (active low: 0 means the slave pulled SDA down)

    //clock high
    PORTB |= (1 << SCL_PIN);
    _delay_us(10);

    int ack = PINB & (1 << SDA_PIN);
    _delay_us(10);

    //clock low
    PORTB &= ~(1 << SCL_PIN);
    _delay_us(10);

    //set SDA as output
    DDRB |= (1 << SDA_PIN);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);
    
    return ack;
}

This is very similar to the insides of the loop in the write function I wrote in Python. The main difference is that instead of a list of ones and zeros, a byte is represented by a uint8_t variable. The write_bytes function utilising this is:

void write_bytes(uint8_t address, uint8_t *bytes){
    //Start condition (bus should be ready)
    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);

    DDRB |= 1;
    //write the address + R/W bit (0 = write)
    write_byte(address<<1);

    //write bytes
    int i;
    for(i=0; bytes[i] != '\0' ; i++)
        write_byte(bytes[i]);

    //Stop condition    
    _delay_us(10);

    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);
    PORTB |= (1 << SCL_PIN);
    _delay_us(10);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);

}

Here, after the start condition, the loop goes through the bytes in the array until the null terminator is encountered and the transmission is terminated with the stop condition.

Before I started building the display control code on top of write_bytes, I tested it with a short program that turns the backlight off and on again, like with the Python code. Not surprisingly, the screen didn't respond at all.

To pinpoint the problem, I first wanted to know whether the I2C chip responds correctly to the sent data. The easiest way to do this is to read the acknowledge bit, but, unlike with the Pi, I had no screen to print its state to. Thus I added a debug LED to my circuit that lit up according to the value returned by write_byte. It revealed that the signal wasn't getting through, which turned out to be caused by a programming error in the clock signal (one HIGH state was accidentally LOW, which is effectively a missing pulse).
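
The check looked roughly like this (a sketch only; the LED pin and the helper name are made up, and remember that write_byte returns 0 when the slave pulls SDA low to acknowledge):

#define DEBUG_LED_PIN 0    //hypothetical: debug LED on PB0

void debug_show_ack(int ack){
    DDRB |= (1 << DEBUG_LED_PIN);        //LED pin as output
    if(ack == 0)
        PORTB |= (1 << DEBUG_LED_PIN);   //slave acknowledged: LED on
    else
        PORTB &= ~(1 << DEBUG_LED_PIN);  //no ACK: LED off
}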

But the screen still didn't turn off. This one was harder to spot because it wasn't just a random error but a design flaw. I tried switching the backlight off by sending a zero byte to the I2C chip. Guess what 0 also stands for? The null terminator. So my initially smart solution turned out to be crap, as it doesn't allow sending a zero byte over I2C at all, which is obviously a serious shortcoming. It doesn't prevent any of the display's features though, so I'll go on with this and fix it sometime later.
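
When I do get around to fixing it, the obvious way is to pass the byte count explicitly instead of relying on a terminator, so that 0x00 can travel over the bus too. A sketch of what that could look like (not in i2c.c yet):

void write_bytes_n(uint8_t address, const uint8_t *bytes, uint8_t n){
    //Start condition
    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);

    //write the address + R/W bit (0 = write)
    write_byte(address << 1);

    //write exactly n bytes, zero bytes included
    uint8_t i;
    for(i = 0; i < n; i++)
        write_byte(bytes[i]);

    //Stop condition
    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);
    PORTB |= (1 << SCL_PIN);
    _delay_us(10);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);
}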

Display control


As we found out earlier, the display has to be used in 4-bit mode with the I2C chip. Commands are sent in two halves, each written three times while toggling the enable bit. To simplify the use of the display, we'll write a bunch of helpful functions to control it.

The header file display.h is written in the same manner as before:

#ifndef DISPLAY_H_
#define DISPLAY_H_

#define LCD_ADDRESS 0b0100111

#include "i2c.h"

void init_display();
void write_instruction(uint8_t rs, uint8_t rw, uint8_t cmd, uint8_t backlight);
void write_text(uint8_t *str);
void write_text_2line(uint8_t *str,uint8_t *str2);

#endif

The file again starts and ends with the guard. Then I define a macro for the address of the display's I2C module and include the I2C code fresh out of the oven.

After power-on, the screen has to be initialised according to the procedure described in its datasheet. With my code this is done as follows:

#include "display.h"

void init_display(){

    //init i2c bus
    init_i2c();

    //50 ms delay (required) 
    _delay_ms(50);

    //initcommand
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
   
    //5 ms delay (required) 
    _delay_ms(5);
  
    //Init command repeated twice more according to datasheet 
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
   
    //Set 4-bit interface  
    write_bytes(LCD_ADDRESS, "\x28\x2c\x28");

}

The string literal "\x38\x3c\x38" is the same as the array {0x38, 0x3C, 0x38, '\0'}. The function write_bytes addresses the device given by LCD_ADDRESS, sends these bytes (0x38 with the enable bit LOW and 0x3C with it HIGH) to the I2C bus, and stops at the null terminator ('\0' = 0x00). After setting the display to 4-bit mode, the initialisation is complete and we can start sending instructions:

void write_instruction(uint8_t rs, uint8_t rw, uint8_t cmd, uint8_t backlight){
    uint8_t hbyte, lbyte, rwrs;

    rwrs = 0;
    if(backlight)
        rwrs |= 0x08;
    if(rs)
        rwrs |= 0x01;
    if(rw)
        rwrs |= 0x02;
    

    hbyte =  (0xf0 & cmd)|rwrs;
    lbyte =  (0xf0 & (cmd<<4))|rwrs;

    uint8_t byte_str[7] = {hbyte,hbyte|0x04,hbyte,lbyte,lbyte|0x04,lbyte,'\0'};

    write_bytes(LCD_ADDRESS, byte_str);

}

The inputs of the function follow the format of the commands given in the datasheet. Thus rs and rw are merely truth values, whereas cmd is the data byte, which is split into two 4-bit parts.
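
To see what actually goes over the bus, here is the clear display command worked out by hand from the code above:

//write_instruction(0, 0, 0x01, 1): clear display, backlight on
//rwrs  = 0x08                          (backlight bit set, RS = 0, RW = 0)
//hbyte = (0xF0 & 0x01) | 0x08 = 0x08   (upper nibble of the command)
//lbyte = (0xF0 & 0x10) | 0x08 = 0x18   (lower nibble shifted to the top)
//sent  : 0x08, 0x0C, 0x08, 0x18, 0x1C, 0x18   (each nibble with the enable bit low-high-low)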

And finally we've reached the point we've been aiming for: writing text! In the Python test program I sent all the characters as arrays of bits, which was very cumbersome. Since the character table of the display is compatible with ASCII codes, we can simply make a plain text string and send it character by character. Non-ASCII and custom characters can be used with \x as before.

void write_text(uint8_t *str){

    int i;
    for(i=0;str[i] != '\0';i++)
        write_instruction(1, 0, str[i], 1);

}

void write_text_2line(uint8_t *str,uint8_t *str2){
    //DDRAM address 1st line
    write_instruction(0, 0, 0x80, 1);
    write_text(str);
    //DDRAM address 2nd line
    write_instruction(0, 0, 0xC0, 1);
    write_text(str2);

}

The first function is a bit stupid: it just starts writing wherever the current address happens to point. The second one, on the other hand, first sets the Display Data RAM (DDRAM) address to the beginning of each row before writing.

Test Program


Here's a small program to test it out:

#define F_CPU 1000000UL 
#include <util/delay.h>

#include "display.h"


int main(void){

    init_display();

    write_instruction(0, 0, 0x28, 1); //Function set (4-bit, 2-line, 5x8 dots)  
    write_instruction(0, 0, 0x08, 1); //Display Off, cursor off, blinking off 
    write_instruction(0, 0, 0x01, 1); //Clear display  
    write_instruction(0, 0, 0x06, 1); //Entry mode (left to right, don't shift display)

  
    write_instruction(0,0,0x0C,1); //Display on     

    write_text_2line("<LENK'S QUEST|=O","Graphics test   ");
 
    while(1){};
    
}

The program naturally starts with the display's initialisation function. It is followed by a bunch of settings, after which the display is turned on and ready to be written to. And here's the result:


Despite all the deficiencies of the code, it works like a charm! But to be honest, that ASCII sword surrounding the title looks a bit crude. I think the graphics need a bit of seasoning...

Friday 15 July 2016

Lenk's Quest part I: Game design

Not all those who wander are lost 
J. R. R. Tolkien

A good adventure doesn't require a map, but at times one can be quite handy; especially when one is building something complicated out of components that are not. In order to decrease the likelihood of Lenk's Quest ending up deep in a swamp populated by the spirits of never-finished projects, I present thee the Document of Design.

Project's roadmap, artist's interpretation

Hardware limitations


Programming microcontrollers differs somewhat from programming a personal computer. With computers, the question I repeatedly ask myself is: how can I do this? Whatever I want to do, they have the juice for it.

Microcontrollers, on the other hand, are not steam engines that can push through everything. They are more like low-profile Segways that get stuck on the first pebble they come across. A more appropriate question is: what can be done? The programmer needs to switch their brain into Haiku mode and come up with creative solutions to squeeze the most out of scarce resources and stringent limitations. I find this a refreshing challenge.

So, what do we have? First of all, I want to use my 16x2 character display (LCD1602), which needs two pins for I2C communication. If I had I2C I/O expander chips like the one on the display, I could connect as many buttons as I wanted (8 per chip) and still use the same two wires the display does. But I don't, nor do I have any other sensible means of serial communication.

Thus, using an ATtiny85, I have 3 free I/O pins for buttons, one pin per button. Not too bad. I only get half a D-pad, but considering that my screen has only two rows of text, I think I can cope with that.

What about sound effects and music? I have a few ideas but I haven't tested them yet. There are some technical complications that I'll go through in a separate post. For now, getting an earworm infection from a suitable tune will have to suffice for this project.

Program flow


The program structure is similar to that of The Torment of Alfred McSilvernuts. In the main program I'll first initialise everything, like configuring the pin directions and setting up the counter. Then the execution moves on to an infinite loop that does nothing.

The loop is interrupted by a timer that overflows approximately every 16.7 ms, aiming for a frame rate of 60 frames per second. All the action happens in the Interrupt Service Routine, where the chip reads whether the buttons are pressed, executes the next step of the game logic, and draws stuff on the display.
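
As a rough sketch of the setup (assuming the ATtiny85 runs at its default 1 MHz clock; the real code may well end up different): with a prescaler of 64, the 8-bit Timer0 overflows every 256 ticks, i.e. about every 16.4 ms (~61 Hz), close enough to the target.

#include <avr/io.h>
#include <avr/interrupt.h>

void init_timer(){
    TCCR0B |= (1 << CS01)|(1 << CS00);  //Timer0 prescaler: clock/64
    TIMSK  |= (1 << TOIE0);             //enable Timer0 overflow interrupt
    sei();                              //enable interrupts globally
}

ISR(TIM0_OVF_vect){
    //read the buttons, run the next step of the game logic, update the display
}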


To make the game more interesting, I want to have more than just one level. I also want my game to have a title screen that pops up at start-up, and an ending screen that tells the player whether they have won or lost. I think the best way to do this is to break the game logic section into separate pieces for different states, like this:


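In code, the split would boil down to a state variable and a switch in the Interrupt Service Routine, something like this (names made up; the real states will follow the diagram above):

typedef enum { STATE_TITLE, STATE_GAME, STATE_WIN, STATE_LOSE } game_state_t;
game_state_t state = STATE_TITLE;

//inside the Interrupt Service Routine
switch(state){
    case STATE_TITLE: /* wait for a button press, then start the game */ break;
    case STATE_GAME:  /* movement, combat, room changes */               break;
    case STATE_WIN:
    case STATE_LOSE:  /* show the ending screen */                       break;
}
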
When it comes to computer games, I believe it's common practice to separate the graphics from the game logic. However, with my hardware, updating the whole screen takes a relatively long time, which makes the display flicker. Not a good thing for the gameplay experience.

Fortunately, I don't have to update the whole display every frame, as normally only small portions of the screen change. I wouldn't want to mix the graphics into the logic, but for now the smartest solution seems to be to update the affected part of the screen immediately after the corresponding piece of game logic (e.g. the character moving a step) has been computed. I'm not completely happy with it, but this way I don't have to think about how to store and pass the information to a separate update function.
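
For example, a completed move could be followed directly by a two-cell update using a helper like the write_instruction function from the display code in part II above (a sketch; old_pos and new_pos are hypothetical column indices on the bottom row, whose DDRAM addresses start at 0xC0):

write_instruction(0, 0, 0xC0 + old_pos, 1);  //DDRAM address of the old cell
write_instruction(1, 0, ' ', 1);             //blank it
write_instruction(0, 0, 0xC0 + new_pos, 1);  //DDRAM address of the new cell
write_instruction(1, 0, '>', 1);             //draw the player character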

Game mechanics, objective and dungeon design


As a Legend of Zelda ripoff inspired piece, the 'camera' will view the game area from a bird's-eye perspective. Since I have only 3 buttons and two rows of text on the screen, it is best to limit the rooms (or levels) to a single dimension: the horizontal one. This way I need only two buttons, left and right, for movement. The third button can then be used for different kinds of actions, like attacking with the sword. I'll call it, not very surprisingly, the action button, or the A button for short.

As the movement is limited to one line, the other line on the screen can be used to present the wall of the room. There are some positions in the room where pressing A makes the player switch rooms instead of attacking. These spots are indicated to the player by placing a door on the wall right next to them.

To make the game a game, we need to add some challenge. Let's say that the player's objective is to escape the dungeon. This can be done through a locked door. The player has to get the key, but it is guarded by a ferocious werewolf who tries to kill the player. To get the key, the player needs to slay the beast first.

Putting all these ideas on paper, the dungeon turned out to look like this:


I decided to settle for two rooms. This way the middle row (the one with the open door) is visible all the time: on the top row of the screen when in the lower room and on the bottom row when in the upper room. This gives the player a sense of the shape of the dungeon. If the wall were always on the same row, the player would have a hard time grasping the geometry, as the camera would appear to turn 180 degrees every time they passed through a door. This solution is not limited to two rooms; I could add an arbitrary number of rooms like this:



I just don't have any real content to put into them. And nothing's more frustrating in video games than doing something totally worthless. It's like taking The Hobbit and stretching it into three movies.

The rooms are short, so they fit on a single screen. This way I don't have to implement camera scrolling. It wouldn't be very complicated, but it's certainly more than my game needs.

You can see that the door is positioned in the middle of the screen, two tiles away from the player's start position. This way the player can test and get used to the controls before entering the second room. There's also a suitable distance between the door and the werewolf, so the player won't get their ass kicked immediately at the doorstep.

The last thing I want to mention about the dungeon design is the bottom left corner of the screen. You may have noticed that the lower room is a bit shorter than the upper one. That's because I need a place for the health counter. It's nice to have the counters in the same position on the screen at all times, so I made part of the level impassable to keep the player from stepping on the counter.

---

Now, with the roadmap of objectives established, we're ready to get our hands dirty. I think the best way to continue is to get started with the graphics. After all, the easiest way to make sure that things are working is to see them in action.

Monday 4 July 2016

Obscure systems and a Zelda rip-off

A new Legend of Zelda is coming out next year. As a long-term fan, I'm so excited that I already went and bought myself a Wii U. It is actually my first console since I moved out on my own. As a matter of fact, in recent years I've been more interested in making games than playing them. But Breath of the Wild is something I don't want to miss. Look at it all: a vast natural world, puzzles utilizing the physics engine, and ancient technology!

Yeah, I'm drooling (figuratively). I don't know whether I should have waited for the new console, the Nintendo NX, but considering that nobody really has a clue about it, I don't think my purchase was a bad one.

The Wii U has a bunch of nice-looking titles, like the extremely well received Super Mario 3D World, which some say is the best Mario game ever. It is also backward compatible with the Wii and has the Virtual Console for the old classics. Apparently it also runs GameCube games after a bit of tweaking, but doing so puts you on Santa's naughty list... right? Probably voids the warranty as well.

On top of that, the Wii U has sold rather badly, so in a few years it is going to be really hard to find. Every now and then I get excited about some specific retro system. For example, 8 years ago I read about the Game Boy Micro and I was like: ooh, a smaller backlit version of the GBA! If I got an SD card reader for it, it would be so handy to carry around and play all the best Game Boy games whenever I feel like it (smartphones weren't so smart back then). I've got to have one!

Blue Game Boy Micro and GBA cartridge [source]
Never heard of it? No wonder, as it sold only 2.42 million units worldwide. For comparison, the Game Boy Advance sold 35.52 million units, and that doesn't include the SP. I failed to find one in any game store, first or second hand, so I eventually bought it off eBay. I personally like it more than the Advance or the SP. It is smaller, but the screen looks better and it feels good in the hand. It doesn't support Game Boy and Game Boy Color games (unless one uses an SD card reader, naughty naughty), which together with the release of the DS might have contributed to its failure. It's a shame, but this time I'm prepared if the same thing happens to the Wii U.

But even if they outlawed the Wii U, Breath of the Wild won't be out until next year. I don't think I can wait that long. That's why I'm going to put my recently acquired electronics skillz to use and make the next Zelda adventure myself... except I don't dare to call it a 'Zelda'. The fans have rather high expectations that my little project will never meet. But even more importantly, Nintendo happens to be very protective of their intellectual property and doesn't take violations kindly (I had to allow ads on my last video if I wanted to use a tune from Super Mario Bros 2).

Following the great success of The Torment of Alfred McSilvernuts, I hereby announce my next adventure game:

It would surely look like this if I had more pixels and, like, colors.

(Although it might look like one, it is totally not a fart joke. I'm a Master of Science after all; my sense of humor is delicate and mature as fuck.)

Unlike the last game, which was just a bunch of LEDs lighting up, this one requires a bit more thought. Just making it is going to be an adventure of its own. I hope the difficulty setting is no higher than medium.