Sunday 24 July 2016

Lenk's Quest part II: Controlling the display with AVR

Earlier I managed to get the LCD1602 display with I2C module working with Raspberry Pi and Python (see part I and part II). Now the challenge is to repeat the same thing with ATtiny and C and take it a step further.

As a disclaimer, if you are interested in making a program on AVR using a display and following this (or any other) post as a tutorial, I have one piece of advice: don't do it blindly. I think it is quite obvious that I'm not a pro at this (especially after this post) and while my technical posts are meant to work as guidelines, they are surely not tutorials. With that in mind, read critically and don't repeat my mistakes.

AVR has this so-called Universal Serial Interface (USI) which enables serial communication at the hardware level. Compared to software implementations, the USI allows faster communication and requires less code. The USI has two modes: a three-wire and a two-wire mode. The two-wire mode (TWI) is functionally identical to I2C but apparently, owing to licensing issues or something, it can't be called that.

My first plan was thus to use TWI instead of writing my own code. It's built-in so it's probably easy to use, right? Soon after I took out the datasheet, I came to the conclusion that I don't want to stress out my brain cells to figure that shit out. I'll probably regret my decision later but it was Sunday then, after all.

I switched to plan B, which was to take my Python code and translate it to C. Implementing a serial protocol in software instead of hardware is called bit banging, by the way. Totally unrelated, it brings to my mind a certain Finnish word, 'pilkunnussija', that is used for a person who is unnecessarily attentive to detail. Literally translating to 'comma fucker', it is somewhat synonymous with 'grammar Nazi', except with an extra serving of heat and lubrication.

Up to this point, I've been writing all my code into a single .c file, but that is not a very good practice for a larger project like Lenk's Quest. To make the source code more manageable, I'll split it into multiple files. There are two parts in controlling the screen: the I2C communication and the display-specific commands. Thus it is natural to make a separate file for the I2C-related functions and definitions, which are then used by the screen control functions in another file.

Before going through the source code, I want to make a small detour. If one desires to use multiple .c files in their program, they have to accompany them with header files as well. Header files (.h) contain macro definitions (i.e. shorthands and mnemonics) and function declarations. But why are they needed?

Suppose you use the function eat_apple in your code that is defined in another file. Without including the appropriate header file, there would be no way for the compiler to know what the function takes in and what it outputs. In C this kind of ignorance is not tolerated. A header file with the line double eat_apple(int); removes this problem. The .c file actually containing the definition of eat_apple is not needed at this point; in the extreme case it doesn't even have to exist yet! It is not required until it is compiled and linked together with all the other source code files, forming a single executable program. (For more info about headers and linking, see here and here.)

I2C Communication


As with my Python code, I need only unidirectional I2C communication: from the microcontroller to the screen. I want to keep it simple, but as I might want to recycle the code in the future, I want to make it general purpose as well. (When programming, it is always good practice to keep the code general rather than specific. It makes reusing and modifying the code easier and helps to develop your thinking process.)

My header file i2c.h looks like this:

#ifndef I2C_H_
#define I2C_H_
#include <avr/io.h>
#include <util/delay.h>

#define SDA_PIN 4
#define SCL_PIN 3

void init_i2c();
void write_bytes(uint8_t address, uint8_t *bytes);

#endif

The first two lines together with the very last one form the header guard that prevents the file from being included multiple times (which would almost certainly happen when dealing with more than one file). Inside the guard I first include some AVR libraries that I'll need later on in the source code file i2c.c.

The next two lines define macros for the pins I'll be using for the communication. This sets PB4 as the data line (SDA) and PB3 as the clock line (SCL). Thanks to the includes, I could alternatively have written this like so:

#define SDA_PIN PB4
#define SCL_PIN PB3

Then come the function declarations. The first one is the initialisation function init_i2c that sets the defined SDA_PIN and SCL_PIN as outputs and their states to HIGH (the I2C bus ready state). The second one, write_bytes, sends a sequence of bytes to the given address. bytes is a pointer that is interpreted as a null-terminated string: something I thought was a smart choice because that way I don't have to count the number of bytes and pass it to the function separately. It eventually turned out to be, erm, not so smart after all, but I'll get back to that later.

Let's move on to the actual source code in i2c.c. The beginning of the file looks like this:

#include "i2c.h"

void init_i2c(){

    //set SDA and SCL as output
    DDRB |= (1 << SDA_PIN)|(1 << SCL_PIN);

    //set both high
    PORTB |= (1 << SDA_PIN)|(1 << SCL_PIN);
    _delay_us(10);

}

Nothing unusual here. SDA_PIN and SCL_PIN dictate the positions of the pins and the bits in DDRB and PORTB are set accordingly.

As the MCU toggles its pins much faster than the Raspberry Pi running Python, I've added some delays to make sure that the receiving party can keep up with the speed. They shouldn't be longer than necessary, as that slows down the communication. The ones you see here are surely not optimised; I just picked a safe value and sprinkled it everywhere I could, as I wanted to avoid any bugs owing to too-fast pin toggling. There are probably some guidelines which I'll look up later on (or not, depending on whether the current solution is fast enough. There's no need to optimize if the solution already meets the requirements).

The write_bytes function uses an auxiliary function called write_byte which writes a single byte to the I2C bus and reads and returns the acknowledge bit:

int write_byte(uint8_t byte){
    //write a single byte to the bus

    //clock down
    PORTB &= ~(1 << SCL_PIN);
    _delay_us(10);

    int i;
    for(i=7;i>-1;i--){
        //set SDA
        if(byte & (1 << i))
            PORTB |= (1 << SDA_PIN);
        else
            PORTB &= ~(1 << SDA_PIN);
        _delay_us(10);

        //clock high
        PORTB |= (1 << SCL_PIN);
        _delay_us(10);

        //clock down
        PORTB &= ~(1 << SCL_PIN);
        _delay_us(10);

    }
    //read ACK

    //set SDA as input
    DDRB &= ~(1 << SDA_PIN);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);

    //get ACK

    //clock high
    PORTB |= (1 << SCL_PIN);
    _delay_us(10);

    int ack = PINB & (1 << SDA_PIN);
    _delay_us(10);

    //clock low
    PORTB &= ~(1 << SCL_PIN);
    _delay_us(10);

    //set SDA as output
    DDRB |= (1 << SDA_PIN);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);
    
    return ack;
}

This is very similar to the inside of the loop in the write function I wrote in Python. The main difference is that instead of a list of ones and zeros, a byte is represented by a uint8_t variable. The write_bytes function utilising this is:

void write_bytes(uint8_t address, uint8_t *bytes){
    //Start condition (bus should be ready)
    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);

    DDRB |= 1;
    //write the address + write bit (R/W = 0)
    write_byte(address<<1);

    //write bytes
    int i;
    for(i=0; bytes[i] != '\0' ; i++)
        write_byte(bytes[i]);

    //Stop condition    
    _delay_us(10);

    PORTB &= ~(1 << SDA_PIN);
    _delay_us(10);
    PORTB |= (1 << SCL_PIN);
    _delay_us(10);
    PORTB |= (1 << SDA_PIN);
    _delay_us(10);

}

Here, after the start condition, the loop goes through the bytes in the array until the null terminator is encountered, and the transmission is terminated with the stop condition.

Before I started to build the display control code on top of write_bytes, I tested it with a short program that turns the backlight off and on again, like with the Python code. Not surprisingly, the screen didn't respond at all.

To pinpoint the problem, I first wanted to know whether the I2C chip responds correctly to the sent data. The easiest way to do this is to read the acknowledge bit but, unlike with the Pi, I had no screen to print its state to. Thus I added a debug LED to my circuit that lit up according to the value returned by write_byte. It revealed that the signal wasn't getting through, which was caused by a programming error in the clock signal (one HIGH state was accidentally LOW, which is effectively a missing pulse).

But the screen still didn't turn off. This one was harder to spot because it wasn't just a random error but a design flaw. I tried setting the screen off by sending a zero byte to the I2C chip. Guess what 0 stands for as well? The null terminator. So my initially smart solution turned out to be crap, as it doesn't allow sending a zero byte over I2C at all, which is obviously a serious shortcoming. It doesn't prevent using any of the display's features though, so I'll go on with this and fix it sometime later.

Display control


As we found out earlier, the display has to be used in a 4-bit mode with the I2C chip. Commands are sent in two parts, each sent three times toggling the enable bit. To simplify the use of the display, we'll write a bunch of helpful functions to control it.

The header file display.h is written in the same manner as before:

#ifndef DISPLAY_H_
#define DISPLAY_H_

#define LCD_ADDRESS 0b0100111

#include "i2c.h"

void init_display();
void write_instruction(uint8_t rs, uint8_t rw, uint8_t cmd, uint8_t backlight);
void write_text(uint8_t *str);
void write_text_2line(uint8_t *str,uint8_t *str2);

#endif

The file again starts and ends with the guard. Then I define a macro for the address of the display's I2C module and include the I2C code, freshly out of the oven.

After power-on, the screen has to be initialised according to the procedure described in its datasheet. With my code this is done as follows:

#include "display.h"

void init_display(){

    //init i2c bus
    init_i2c();

    //50 ms delay (required) 
    _delay_ms(50);

    //initcommand
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
   
    //5 ms delay (required) 
    _delay_ms(5);
  
    //Init command repeated twice more according to datasheet 
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
    write_bytes(LCD_ADDRESS, "\x38\x3c\x38");
   
    //Set 4-bit interface  
    write_bytes(LCD_ADDRESS, "\x28\x2c\x28");

}

The string literal "\x38\x3c\x38" is the same as the array {0x38,0x3C,0x38,'\0'}. The function write_bytes addresses the device identified by LCD_ADDRESS, sends these bytes (0x38 with the enable bit LOW and 0x3C with it HIGH) to the I2C bus and stops at the null terminator ('\0' = 0x00). After setting the display to 4-bit mode, the initialisation is complete and we can start sending instructions:

void write_instruction(uint8_t rs, uint8_t rw, uint8_t cmd, uint8_t backlight){
    uint8_t hbyte, lbyte, rwrs;

    rwrs = 0;
    if(backlight)
        rwrs |= 0x08;
    if(rs)
        rwrs |= 0x01;
    if(rw)
        rwrs |= 0x02;
    

    hbyte =  (0xf0 & cmd)|rwrs;
    lbyte =  (0xf0 & (cmd<<4))|rwrs;

    uint8_t byte_str[7] = {hbyte,hbyte|0x04,hbyte,lbyte,lbyte|0x04,lbyte,'\0'};

    write_bytes(LCD_ADDRESS, byte_str);

}

The function's arguments follow the command format given in the datasheet. Thus rs and rw are merely truth values, whereas cmd is the data byte that is split into two 4-bit parts.

And finally we've reached the point we've been aiming for: writing text! In the Python test program I sent all the characters as arrays of bits, which was very cumbersome. Since the character table of the display is compatible with ASCII codes, we can simply make a plain text string and send it character by character. Non-ASCII and custom characters can be used with \x as earlier.

void write_text(uint8_t *str){

    int i;
    for(i=0;str[i] != '\0';i++)
        write_instruction(1, 0, str[i], 1);

}

void write_text_2line(uint8_t *str,uint8_t *str2){
    //DDRAM address 1st line
    write_instruction(0, 0, 0x80, 1);
    write_text(str);
    //DDRAM address 2nd line
    write_instruction(0, 0, 0xC0, 1);
    write_text(str2);

}

The first function is a bit stupid. It just starts writing wherever the current address is pointing. The second one, on the other hand, first sets the Display Data RAM (DDRAM) address to the beginning of each row before writing.

Test Program


Here's a small program to test it out:

#define F_CPU 1000000UL 
#include <util/delay.h>

#include "display.h"


int main(void){

    init_display();

    write_instruction(0, 0, 0x28, 1); //Function set (4-bit, 2-line, 5x8 dots)  
    write_instruction(0, 0, 0x08, 1); //Display Off, cursor off, blinking off 
    write_instruction(0, 0, 0x01, 1); //Clear display  
    write_instruction(0, 0, 0x06, 1); //Entry mode (left to right, don't shift display)

  
    write_instruction(0,0,0x0C,1); //Display on     

    write_text_2line("<LENK'S QUEST|=O","Graphics test   ");
 
    while(1){};
    
}

The program naturally starts with the display's initialisation function. It is followed by a bunch of settings, after which the display is turned on and ready to be written to. And here's the result:


Despite all the deficiencies of the code, it works like a charm! But to be honest, that ASCII sword surrounding the title looks a bit crude. I think the graphics need a bit of seasoning...
