Hi all,
I don't know if this is the right place for this post, but here we go...
Look at the little program below:
#include <string.h>

int main( int argc, char *argv[ ] ) {
unsigned char arr[ ] = "the brown fox...";
// sets size to 17 !!! ???
unsigned int size = sizeof( arr ) / sizeof( unsigned char );
// sets size to 16 !!! ???
size = strlen( ( const char* )arr );
return 0;
}
As seen in the comments, when I use sizeof I get "size = 17" and
when I use strlen I get "size = 16"...
Is this correct and, if it is, why? :?
Cheers
When you use sizeof() in this case the nul character, that terminates the string, will also be counted - strlen() will not count it.
Pelle
Quote from: "Pelle"When you use sizeof() in this case the nul character, that terminates the string, will also be counted - strlen() will not count it.
Pelle
Yeah!! You're right, I don't know what I was thinking when I posted this message. :oops:
Thanks and sorry.