Pelles C forum

C language => Expert questions => Topic started by: czerny on January 26, 2015, 03:52:13 PM

Title: writing 12 Bits
Post by: czerny on January 26, 2015, 03:52:13 PM
I have to write 12-bit codes to a file (LZW compression), like in GIF files. I have found information saying that these 12 bits are written little-endian.

But the code I have found so far does not handle it that way, so I am unsure.

Say I have two 12-bit codes (with 3 nibbles each):

Code: [Select]
1: aaaa bbbb cccc
2: dddd eeee ffff

is this written as

Code: [Select]
bbbb cccc aaaa eeee ffff dddd
or as

Code: [Select]
aaaa bbbb cccc dddd eeee ffff
Title: Re: writing 12 Bits
Post by: frankie on January 26, 2015, 05:16:30 PM
Difficult to say without knowing what it is intended for.
The assumption you make, that the machine reading the code can work with nibbles, implies that 12 bits is not the word size of that machine. The meaning of endianness on a machine of unknown word size is a big issue (it could have a 6-bit word, in which case you would have to save the first 6 bits and then the next 6 for little-endian).
If the machine, or the code that uses the data, is byte oriented, maybe your first guess is the correct one...

If I remember correctly, from the days when I wrote assembly to access the 12-bit FAT on floppies, the trick was to read 3 bytes at a time, mask the value to get the first 12 bits, then shift right by 12 to get the next 12 bits.
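A minimal sketch of that 3-byte trick (the helper name is illustrative), assuming the three bytes are assembled little-endian as on a FAT12 volume:
Code: [Select]
#include <stdio.h>

/* Unpack two 12-bit values from three bytes: assemble the bytes
   little-endian into a 24-bit value, mask the low 12 bits for the
   first value, shift right by 12 for the second. */
static void unpack12(const unsigned char b[3],
                     unsigned *first, unsigned *second)
{
    unsigned long v = (unsigned long)b[0]
                    | ((unsigned long)b[1] << 8)
                    | ((unsigned long)b[2] << 16);
    *first  = (unsigned)(v & 0xFFFu);   /* low 12 bits  */
    *second = (unsigned)(v >> 12);      /* high 12 bits */
}

int main(void)
{
    unsigned char buf[3] = { 0xBC, 0xFA, 0xDE };  /* example bytes */
    unsigned a, b;
    unpack12(buf, &a, &b);
    printf("%03X %03X\n", a, b);   /* prints ABC DEF */
    return 0;
}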
Title: Re: writing 12 Bits
Post by: czerny on January 26, 2015, 05:37:47 PM
The CPU is an 80x86, nothing special. I have divided the 12-bit value into nibbles only to explain the question.
Title: Re: writing 12 Bits
Post by: frankie on January 26, 2015, 05:50:02 PM
The 'machine' is meant as an abstract data-consumer entity, but maybe the technique used for the 12-bit FAT could apply...
Title: Re: writing 12 Bits
Post by: AlexN on January 28, 2015, 09:40:09 AM
Perhaps this helps:
Code: [Select]
#include <stdio.h>

/* Two 12-bit fields packed into one storage unit; the bit-field
   layout (and therefore the byte order printed below) is
   implementation-defined. */
typedef struct
{
    unsigned int val1:12;   /* unsigned: 0xabc does not fit in a signed 12-bit field */
    unsigned int val2:12;
} str12;

typedef union
{
    str12 str;
    unsigned char ch[3];
} un24;

un24 u;

int main(void)
{
    int i;
    u.str.val1 = 0xabc;
    u.str.val2 = 0xdef;
    for (i = 0; i < 3; i++)
    {
        printf("%02x ", u.ch[i]);
    }
    printf("\n");
    return 0;
}
;)
I think this does not depend on the CPU (an 80x86 cannot write 12 bits at a time), but on the compiler.
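A sketch of the same packing done with explicit shifts (helper name is illustrative), so the byte order is fixed by the code rather than by the compiler's bit-field layout; on a typical little-endian compiler it matches what the union above prints:
Code: [Select]
#include <stdio.h>

/* Pack two 12-bit values into three bytes: first value in the low
   12 bits, bytes written out low byte first. */
static void pack12(unsigned first, unsigned second, unsigned char b[3])
{
    unsigned long v = (first & 0xFFFu)
                    | ((unsigned long)(second & 0xFFFu) << 12);
    b[0] = (unsigned char)(v & 0xFF);
    b[1] = (unsigned char)((v >> 8) & 0xFF);
    b[2] = (unsigned char)((v >> 16) & 0xFF);
}

int main(void)
{
    unsigned char buf[3];
    pack12(0xabc, 0xdef, buf);
    printf("%02x %02x %02x\n", buf[0], buf[1], buf[2]);  /* bc fa de */
    return 0;
}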
Title: Re: writing 12 Bits
Post by: czerny on January 28, 2015, 10:43:12 AM
The problem is not how to do it; there are many ways to do that.
The problem is how it is actually done in the particular LZW implementation used by GIF.
Title: Re: writing 12 Bits
Post by: frankie on January 28, 2015, 10:57:48 AM
Take a look at the GifLib project (http://giflib.sourceforge.net/) and here (http://giflib.sourceforge.net/whatsinagif/lzw_image_data.html).
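For what it's worth, the packing that page describes is LSB-first: each code is appended above the bits already pending, and complete bytes are written out low byte first. A minimal sketch, ignoring GIF's 255-byte sub-block framing (names are illustrative):
Code: [Select]
#include <stdio.h>

/* LSB-first code packing as used by the GIF LZW stream
   (variable code width, growing up to 12 bits). */
static unsigned long bitbuf = 0;   /* pending bits, LSB = next bit out */
static int bitcount = 0;           /* number of pending bits */

static void put_code(unsigned code, int width, FILE *out)
{
    bitbuf |= (unsigned long)code << bitcount;  /* append above pending bits */
    bitcount += width;
    while (bitcount >= 8) {                     /* flush whole bytes, low first */
        fputc((int)(bitbuf & 0xFF), out);
        bitbuf >>= 8;
        bitcount -= 8;
    }
}

static void flush_bits(FILE *out)
{
    if (bitcount > 0)
        fputc((int)(bitbuf & 0xFF), out);       /* pad last byte with zero bits */
    bitbuf = 0;
    bitcount = 0;
}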
Title: Re: writing 12 Bits
Post by: czerny on January 28, 2015, 06:59:16 PM
Thank you! Somehow I did not see it. :(