Hi Martin and all,

Many thanks for your help. What you noticed is exactly what I want to do. More precisely, I want to print the bits of a number always beginning with the most significant bit and ending with the least significant bit. In other words, I always want to print the bits of an octet as if it were represented in big-endian order. This is why, when my code detects that it is running on a big-endian machine, the bitfields structure defines bit b0 first, and when it detects a little-endian machine, it defines bit b7 first.

In practice, this technique is useful for code that deals with network protocols implemented on different platforms. The needed data structures are defined in a dedicated .h file and are then manipulated by code that is completely independent of endianness.

It now works under Linux. I posted my request on another, more Linux-oriented list, and some kind people told me that I had forgotten the following line:

#include <endian.h>

I added that line to my code and, under Linux, it now seems to work like a charm. However, I also badly need this piece of code to work under Windows, where I am compiling with Visual C++ Pro 2005. The compiler complains that <endian.h> does not exist. What should I replace "#include <endian.h>" with under Windows, please?

Many thanks. Have a nice day.

Chris D

-----Original Message-----
From: programmingblind-bounce@xxxxxxxxxxxxx [mailto:programmingblind-bounce@xxxxxxxxxxxxx] On Behalf Of Martin Slack
Sent: Tuesday, 20 July 2010 13:19
To: programmingblind@xxxxxxxxxxxxx
Subject: Re: How should I test endianness in C or C++?

Hi Christophe,

Although your two structs are ordered differently, the print routine takes the bits in order b0 to b7 whatever the endianness of the machine.
Corrections welcome,

Martin

----- Original Message -----
From: "Delaunay Christophe" <christophe.delaunay@xxxxxxxxxxxxxxx>
To: <programmingblind@xxxxxxxxxxxxx>
Sent: Tuesday, July 20, 2010 8:28 AM
Subject: How should I test endianness in C or C++?

Hi all,

Here is my piece of code:

------ BEGINNING OF CODE ------
/**
 * This small program tests and explains endianness.
 */

/* --- The needed header files --- */
#include <stdio.h>    /* for input and output */
#include <inttypes.h> /* special integer types */

/* --- Given definition --- */
#ifdef WIN32 /* windows-like OS assumed */
#define ATTRIBUTE
#else /* linux-like OS assumed */
#define ATTRIBUTE __attribute__((packed))
#endif /* windows or linux */

/* --- The byte divided into 8 bit fields --- */
typedef struct _bitfields {
#if __BYTE_ORDER == __BIG_ENDIAN
  uint8_t b0:1;
  uint8_t b1:1;
  uint8_t b2:1;
  uint8_t b3:1;
  uint8_t b4:1;
  uint8_t b5:1;
  uint8_t b6:1;
  uint8_t b7:1;
#elif __BYTE_ORDER == __LITTLE_ENDIAN
  uint8_t b7:1;
  uint8_t b6:1;
  uint8_t b5:1;
  uint8_t b4:1;
  uint8_t b3:1;
  uint8_t b2:1;
  uint8_t b1:1;
  uint8_t b0:1;
#endif /* big or little endian */
} ATTRIBUTE bitfields;

/* --- main --- */
int main(void)
{
  uint8_t val = 0;
  bitfields* bf = (bitfields*)(&val);

#if __BYTE_ORDER == __BIG_ENDIAN
  printf("This machine works in big endian.\n");
#elif __BYTE_ORDER == __LITTLE_ENDIAN
  printf("This machine works in little endian.\n");
#endif /* big or little endian */

  printf("To understand how bits are ordered into a byte.\n");
  printf("Give a value : ");
  scanf("%hhu", &val);

  while (val != 0) {
    uint8_t n = bf->b0 & 1;
    printf("Given value = %d.\n", val);
    printf("val=%d,", n);
    n = bf->b1 & 1;
    printf("%d,", n);
    n = bf->b2 & 1;
    printf("%d,", n);
    n = bf->b3 & 1;
    printf("%d,", n);
    n = bf->b4 & 1;
    printf("%d,", n);
    n = bf->b5 & 1;
    printf("%d,", n);
    n = bf->b6 & 1;
    printf("%d,", n);
    n = bf->b7 & 1;
    printf("%d from b0 to b7.\n", n);
    printf("Give another value : ");
    scanf("%hhu", &val);
  }
  /* at this time, value is 0 */
  printf("Finished.\n");
  return 0;
} /* main
*/
------ END OF CODE ------

Problem: when I run this program, here is what I obtain:

------ BEGINNING OF OUTPUT ------
[delaunayc@rennxlxrda013 TestEndian]$ ./TestByteOrder
This machine works in big endian.
To understand how bits are ordered into a byte.
Give a value : 1
Given value = 1.
val=1,0,0,0,0,0,0,0 from b0 to b7.
Give another value : 2
Given value = 2.
val=0,1,0,0,0,0,0,0 from b0 to b7.
Give another value : 3
Given value = 3.
val=1,1,0,0,0,0,0,0 from b0 to b7.
Give another value : 4
Given value = 4.
val=0,0,1,0,0,0,0,0 from b0 to b7.
Give another value : 128
Given value = 128.
val=0,0,0,0,0,0,0,1 from b0 to b7.
Give another value : 130
Given value = 130.
val=0,1,0,0,0,0,0,1 from b0 to b7.
Give another value : 131
Given value = 131.
val=1,1,0,0,0,0,0,1 from b0 to b7.
Give another value : 0
Finished.
[delaunayc@rennxlxrda013 TestEndian]$
------ END OF OUTPUT ------

As you can see, this machine claims that it works in big endian but presents the bit order in little endian. I therefore suspect my #if test is wrong, but what did I do wrong, please? The machine is a PC running Fedora 12, and I compiled the above code with GCC version 4.4.3.

Many thanks in advance. Have a nice day.

Chris D

__________
View the list's information and change your settings at //www.freelists.org/list/programmingblind