Hi Joseph,

Many thanks for your suggestion. Although I don't like such hard-coded assumptions, I implemented the workaround you proposed, and the app now does what I want both under Windows, compiled with Visual C++ 2005 Pro, and under Linux, compiled with GCC 4.3.3. Here is the final code:

------ BEGINNING OF CODE ------

/**
 *
 * This small program tests and explains endianness.
 *
 */

/* --- The needed header files --- */

#include <stdio.h>    /* for input and output */
#include <stdlib.h>   /* calloc() */
#include <malloc.h>   /* calloc() */

/* --- Given definition --- */

#ifdef _WIN32 /* windows-like OS assumed */
#define ATTRIBUTE
#define uint8_t unsigned char
#define __LITTLE_ENDIAN 1234
#define __BIG_ENDIAN    4321
#define __PDP_ENDIAN    3412
#define __BYTE_ORDER    __LITTLE_ENDIAN
#else /* linux-like OS assumed */
#include <inttypes.h> /* special integer types */
#include <endian.h>   /* defines the macros needed to test endianness */
#define ATTRIBUTE __attribute__((packed))
#endif /* windows or linux */

/* --- The byte divided into 8 bit fields --- */

typedef struct _bitfields {
#if __BYTE_ORDER == __LITTLE_ENDIAN
  uint8_t b7:1;
  uint8_t b6:1;
  uint8_t b5:1;
  uint8_t b4:1;
  uint8_t b3:1;
  uint8_t b2:1;
  uint8_t b1:1;
  uint8_t b0:1;
#elif __BYTE_ORDER == __BIG_ENDIAN
  uint8_t b0:1;
  uint8_t b1:1;
  uint8_t b2:1;
  uint8_t b3:1;
  uint8_t b4:1;
  uint8_t b5:1;
  uint8_t b6:1;
  uint8_t b7:1;
#endif /* big or little endian */
} ATTRIBUTE bitfields;

/* --- main --- */

int main(void)
{
  uint8_t *val = (uint8_t*)calloc(1, 1);
  bitfields *bf = (bitfields*)val;

  printf("Tests endianness.\n");
#if 1 /* double check code */
  printf("Little endian = %d, ", __LITTLE_ENDIAN);
  printf("big endian = %d, ", __BIG_ENDIAN);
  printf("and byte order = %d.\n", __BYTE_ORDER);
#endif /* end of double check code.
*/
#if __BYTE_ORDER == __LITTLE_ENDIAN
  printf("This machine works in little endian.\n");
#elif __BYTE_ORDER == __BIG_ENDIAN
  printf("This machine works in big endian.\n");
#endif /* big or little endian */
#if 1 /* Another double check code */
  printf("Size of value is %d ", (int)sizeof(*val));
  printf("and size of bit fields is %d.\n", (int)sizeof(*bf));
#endif /* End of the second double check code */

  printf("\nTo understand how bits are ordered into a byte,\n");
  printf("Give a value : ");
#ifdef _WIN32 /* scanf() is deprecated by VS 2005. */
  scanf_s("%hhu", val);
#else /* no problem with this under linux */
  scanf("%hhu", val);
#endif /* windows or linux */
  while ( *val ) {
    uint8_t n = bf->b0 & 1;
    printf("Given value = %d.\n", *val);
    printf("val=%d,", n);
    n = bf->b1 & 1;
    printf("%d,", n);
    n = bf->b2 & 1;
    printf("%d,", n);
    n = bf->b3 & 1;
    printf("%d,", n);
    n = bf->b4 & 1;
    printf("%d,", n);
    n = bf->b5 & 1;
    printf("%d,", n);
    n = bf->b6 & 1;
    printf("%d,", n);
    n = bf->b7 & 1;
    printf("%d from b0 to b7.\n", n);
    printf("Give another value : ");
#ifdef _WIN32 /* scanf() is deprecated by VS 2005. */
    scanf_s("%hhu", val);
#else /* no problem with this under linux */
    scanf("%hhu", val);
#endif /* windows or linux */
  } /* at this point, the value is 0 */
  printf("Finished.\n");
  free(val);
  return 0;
} /* main */

------ END OF CODE ------

Here is the output now printed by this piece of code:

------ BEGINNING OF OUTPUT ------
Tests endianness.
Little endian = 1234, big endian = 4321, and byte order = 1234.
This machine works in little endian.
Size of value is 1 and size of bit fields is 1.

To understand how bits are ordered into a byte,
Give a value : 1
Given value = 1.
val=0,0,0,0,0,0,0,1 from b0 to b7.
Give another value : 128
Given value = 128.
val=1,0,0,0,0,0,0,0 from b0 to b7.
Give another value : 0
Finished.
------ END OF OUTPUT ------

So now, as you can see, the bits of an octet are always shown beginning with the MSB, as I wanted. However, a strange problem remains under Windows.
Just after printing "Finished.", the debugger shows a dialog box stating that the heap was corrupted during execution. Here is the message contained in the dialog box:

------ BEGINNING OF MESSAGE ------
Microsoft Visual C++ Debug Library

Debug Error!

Program: c:\local\testendian\debug\TestEndian.exe

HEAP CORRUPTION DETECTED: after Normal block (#58) at 0x00362E90.
CRT detected that the application wrote to memory after end of heap buffer.

(Press Retry to debug the application)
------ END OF MESSAGE ------

The faulty instruction seems to be "free(val)", and I really can't figure out why. Of course, since "free(val)" is the very last statement of the program, I could simply omit it: if I comment out the "free(val)" line, my code ends smoothly, with no error message. In this particular case the resulting memory leak is not critical, since the memory lost by the program is reclaimed by the OS as soon as it exits. However, this is not the first time I have noticed this annoying behaviour, and when such a thing happens in the middle of a program's execution, Windows does not always complain; the program keeps running silently until the heap is corrupted beyond recovery at some later time.

So please, what did I do wrong in this piece of code? I suspect the problem comes from the fact that my bitfields structure points exactly at the input value, because when I remove it, along with the associated code that prints the bits one by one, the heap is no longer corrupted. How can I avoid this, please?

Many thanks in advance. Have a nice day.

Chris D

-----Original Message-----
From: programmingblind-bounce@xxxxxxxxxxxxx [mailto:programmingblind-bounce@xxxxxxxxxxxxx] On Behalf Of Joseph Lee
Sent: Tuesday, July 20, 2010 14:54
To: programmingblind@xxxxxxxxxxxxx
Subject: RE: How should I test endianness in C or C++?

Hi,

Try using quotes instead of arrows. But first, you might want to include it as part of your project.
Cheers,
Joseph

-----Original Message-----
From: programmingblind-bounce@xxxxxxxxxxxxx [mailto:programmingblind-bounce@xxxxxxxxxxxxx] On Behalf Of Delaunay Christophe
Sent: Tuesday, July 20, 2010 5:44 AM
To: programmingblind@xxxxxxxxxxxxx
Subject: RE: How should I test endianness in C or C++?

Hi Martin and all,

Many thanks for your help. In fact, what you noticed is exactly what I want to do. More precisely, I want to print the bits of a number always beginning with the most significant bit and ending with the least significant bit. In other words, I always want to print the bits of an octet as if it were represented in big endian. This is the reason why, when my code detects that it should work in big endian, the bitfields structure defines bit b0 first, and when the code detects that it works in little endian, it defines the bitfields structure with bit b7 first.

In practice, this trick is useful for code that deals with network protocols implemented on different platforms. The needed data structures are defined in a dedicated .h file and are then manipulated by code that is totally independent of endianness.

It now works under Linux. I posted my request on another, more Linux-dedicated list, and some kind people told me that I had forgotten the following line:

#include <endian.h>

I added that line to my code and, under Linux, it now seems to work like a charm. However, I also really need this piece of code to work under Windows. Under this OS, I'm compiling with Visual C++ Pro 2005. The compiler complains that <endian.h> does not exist. What should I replace "#include <endian.h>" with under Windows, please?

Many thanks. Have a nice day.

Chris D

-----Original Message-----
From: programmingblind-bounce@xxxxxxxxxxxxx [mailto:programmingblind-bounce@xxxxxxxxxxxxx] On Behalf Of Martin Slack
Sent: Tuesday, July 20, 2010 13:19
To: programmingblind@xxxxxxxxxxxxx
Subject: Re: How should I test endianness in C or C++?
Hi Christophe,

Although your two structs are ordered differently, the print routine takes the bits in order b0 to b7 whatever the endianness of the machine.

Corrections welcome,

Martin

----- Original Message -----
From: "Delaunay Christophe" <christophe.delaunay@xxxxxxxxxxxxxxx>
To: <programmingblind@xxxxxxxxxxxxx>
Sent: Tuesday, July 20, 2010 8:28 AM
Subject: How should I test endianness in C or C++?

Hi all,

Here is my piece of code:

------ BEGINNING OF CODE ------

/**
 *
 * This small program tests and explains endianness.
 *
 */

/* --- The needed header files --- */

#include <stdio.h>    /* for input and output */
#include <inttypes.h> /* special integer types */

/* --- Given definition --- */

#ifdef WIN32 /* windows-like OS assumed */
#define ATTRIBUTE
#else /* linux-like OS assumed */
#define ATTRIBUTE __attribute__((packed))
#endif /* windows or linux */

/* --- The byte divided into 8 bit fields --- */

typedef struct _bitfields {
#if __BYTE_ORDER == __BIG_ENDIAN
  uint8_t b0:1;
  uint8_t b1:1;
  uint8_t b2:1;
  uint8_t b3:1;
  uint8_t b4:1;
  uint8_t b5:1;
  uint8_t b6:1;
  uint8_t b7:1;
#elif __BYTE_ORDER == __LITTLE_ENDIAN
  uint8_t b7:1;
  uint8_t b6:1;
  uint8_t b5:1;
  uint8_t b4:1;
  uint8_t b3:1;
  uint8_t b2:1;
  uint8_t b1:1;
  uint8_t b0:1;
#endif /* big or little endian */
} ATTRIBUTE bitfields;

/* --- main --- */

int main(void)
{
  uint8_t val = 0;
  bitfields *bf = (bitfields*)(&val);

#if __BYTE_ORDER == __BIG_ENDIAN
  printf("This machine works in big endian.\n");
#elif __BYTE_ORDER == __LITTLE_ENDIAN
  printf("This machine works in little endian.\n");
#endif /* big or little endian */
  printf("To understand how bits are ordered into a byte.\n");
  printf("Give a value : ");
  scanf("%hhu", &val);
  while ( val != 0 ) {
    uint8_t n = bf->b0 & 1;
    printf("Given value = %d.\n", val);
    printf("val=%d,", n);
    n = bf->b1 & 1;
    printf("%d,", n);
    n = bf->b2 & 1;
    printf("%d,", n);
    n = bf->b3 & 1;
    printf("%d,", n);
    n = bf->b4 & 1;
    printf("%d,", n);
    n = bf->b5 & 1;
    printf("%d,", n);
    n = bf->b6 & 1;
    printf("%d,", n);
    n = bf->b7 & 1;
    printf("%d from b0 to b7.\n", n);
    printf("Give another value : ");
    scanf("%hhu", &val);
  } /* at this point, the value is 0 */
  printf("Finished.\n");
  return 0;
} /* main */

------ END OF CODE ------

Problem: when I run this program, here is what I obtain:

------ BEGINNING OF OUTPUT ------
[delaunayc@rennxlxrda013 TestEndian]$ ./TestByteOrder
This machine works in big endian.
To understand how bits are ordered into a byte.
Give a value : 1
Given value = 1.
val=1,0,0,0,0,0,0,0 from b0 to b7.
Give another value : 2
Given value = 2.
val=0,1,0,0,0,0,0,0 from b0 to b7.
Give another value : 3
Given value = 3.
val=1,1,0,0,0,0,0,0 from b0 to b7.
Give another value : 4
Given value = 4.
val=0,0,1,0,0,0,0,0 from b0 to b7.
Give another value : 128
Given value = 128.
val=0,0,0,0,0,0,0,1 from b0 to b7.
Give another value : 130
Given value = 130.
val=0,1,0,0,0,0,0,1 from b0 to b7.
Give another value : 131
Given value = 131.
val=1,1,0,0,0,0,0,1 from b0 to b7.
Give another value : 0
Finished.
[delaunayc@rennxlxrda013 TestEndian]$
------ END OF OUTPUT ------

As you can see, this machine says that it works in big endian but presents the byte order in little endian. Therefore, I suspect my #if test is wrong, but what did I do wrong, please? The machine is a PC running Fedora 12, and I compiled the above code with GCC version 4.4.3.

Many thanks in advance. Have a nice day.

Chris D

__________
View the list's information and change your settings at //www.freelists.org/list/programmingblind