The following code
#include <cstdio>
#include <cstdint>
#include <type_traits>

int main()
{
    static const char* b2s[] = { "no", "yes" };
    printf( "%s\n", b2s[std::is_same<unsigned long, uint64_t>::value] );
}
prints yes when compiled on Linux, and no when compiled on OSX.
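As far as I can tell, unsigned long is 64 bits wide on both systems; the difference is which fundamental type uint64_t is an alias for (unsigned long on my Linux box, unsigned long long on OSX). A small diagnostic along these lines, separate from the library code, makes that visible:

#include <cstdio>
#include <cstdint>
#include <type_traits>

int main()
{
    // On my Linux box uint64_t turns out to be unsigned long,
    // while on OSX it is unsigned long long.
    printf( "uint64_t == unsigned long      : %s\n",
            std::is_same<uint64_t, unsigned long>::value ? "yes" : "no" );
    printf( "uint64_t == unsigned long long : %s\n",
            std::is_same<uint64_t, unsigned long long>::value ? "yes" : "no" );
    printf( "sizeof(unsigned long)          : %zu\n", sizeof(unsigned long) );
}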
While trying to understand why, I read that this is apparently normal. How can I deal with it if I am developing a library and want it to be portable? Will I have to handle each OS's choice of underlying type separately?
Here is an example:
foo.h
#include <cstdint>

template <class T>
struct Foo
{
    static const char id;
};
foo.cpp
#include "foo.h"
#define DEFINE_FOO(type_,id_) \
template <> const char Foo<type_>::id = id_; \
template <> const char Foo<const type_>::id = id_;
DEFINE_FOO( bool,'a')
DEFINE_FOO( int8_t,'b')
DEFINE_FOO( uint8_t,'c')
DEFINE_FOO( int16_t,'d')
DEFINE_FOO(uint16_t,'e')
DEFINE_FOO( int32_t,'f')
DEFINE_FOO(uint32_t,'g')
DEFINE_FOO( int64_t,'h')
DEFINE_FOO(uint64_t,'i')
DEFINE_FOO( float,'j')
DEFINE_FOO( double,'k')
// OSX requires this, but Linux considers it a re-definition
// DEFINE_FOO(unsigned long,'l')
As part of my library, foo.cpp is compiled and then linked whenever I need to create an executable (say from main.cpp). Typically, this looks like:
$(CC) -c -Wall -std=c++0x -o foo.o foo.cpp
$(CC) -Wall -std=c++0x -o main main.cpp foo.o
That works on one platform but fails on the other: if I uncomment DEFINE_FOO(unsigned long,'l') in foo.cpp then OSX is happy, but Linux complains that Foo<uint64_t> is being redefined (since uint64_t is unsigned long there). And vice versa.
How can that be normal? Is there a "portable" way to deal with this?
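For illustration, here is a sketch of one workaround I am considering (untested on both platforms, and the id letters are arbitrary placeholders): specialize Foo on the fundamental types rather than on the <cstdint> aliases. Each alias is a typedef for exactly one fundamental type, so no combination of platform typedefs should produce a duplicate definition, though it does mean the ids are attached to fundamental types rather than to fixed widths:

#include "foo.h"

#define DEFINE_FOO(type_,id_) \
template <> const char Foo<type_>::id = id_; \
template <> const char Foo<const type_>::id = id_;

// Specializing on fundamental types instead of the fixed-width aliases:
// int8_t, uint64_t, etc. are each a typedef for one of these on the
// platforms I have checked, so the set below covers them without ever
// defining the same specialization twice.
DEFINE_FOO(bool,               'a')
DEFINE_FOO(signed char,        'b')
DEFINE_FOO(unsigned char,      'c')
DEFINE_FOO(short,              'd')
DEFINE_FOO(unsigned short,     'e')
DEFINE_FOO(int,                'f')
DEFINE_FOO(unsigned int,       'g')
DEFINE_FOO(long,               'h')
DEFINE_FOO(unsigned long,      'i')
DEFINE_FOO(long long,          'j')
DEFINE_FOO(unsigned long long, 'k')
DEFINE_FOO(float,              'l')
DEFINE_FOO(double,             'm')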