Friday, April 24, 2015

Fitting string literals for different string classes

The problem

I am implementing a class where I want to let the user choose the string type (std::string, std::wstring, std::u16string, ...) via a template parameter. What I currently fail to do is make string literals fit the chosen string type: once I decide on a literal prefix ("hello" vs. L"hello" vs. u"hello" vs. U"hello"), I get compilation errors for all incompatible string classes.

Toy example

As an example, consider the following code (compile with -std=c++11):

#include <string>

template<typename StringType>
void hello_string()
{
    StringType result("hello");
}

int main()
{
    // works
    hello_string<std::string>();
    hello_string<std::basic_string<char>>();

    // the code below does not compile
    hello_string<std::wstring>();
    hello_string<std::basic_string<unsigned char>>();
    hello_string<std::u16string>();
}

Function hello_string() shows the essence of what I want to do: take a string type as a template parameter, and assign string literals to variables of this type.

Possible workaround

One way to overcome my problem would be to implement several specializations of the hello_string() function, as sketched below. The problem is that this would lead to several copies of each string literal - one per literal prefix. I think this is rather ugly, and there must be a better way.
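For illustration, here is a minimal sketch of what such specializations might look like (hypothetical code, not taken from my actual class); note how the literal "hello" has to be repeated with a different prefix in every specialization:

#include <string>

template<typename StringType>
void hello_string();

// One explicit specialization per string type; the same text is
// duplicated once for each literal prefix.
template<>
void hello_string<std::string>()
{
    std::string result("hello");
}

template<>
void hello_string<std::wstring>()
{
    std::wstring result(L"hello");
}

template<>
void hello_string<std::u16string>()
{
    std::u16string result(u"hello");
}

int main()
{
    hello_string<std::string>();
    hello_string<std::wstring>();
    hello_string<std::u16string>();
}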

Another way could be to choose "normal" narrow string literals as the single source and have helper functions convert them to the different string types (see the sketch below). While this would avoid the code duplication, it would introduce unnecessary run-time conversions of something that is actually constant.
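A rough sketch of this idea, assuming the literals only contain characters that map one-to-one to the target character type (the from_narrow helper is hypothetical):

#include <string>

// Hypothetical helper: build the requested string type from an ordinary
// narrow literal, character by character. This only works for literals
// restricted to e.g. ASCII, and the conversion happens at run time even
// though the text is constant.
template<typename StringType>
StringType from_narrow(const char* text)
{
    StringType result;
    for (const char* p = text; *p != '\0'; ++p)
        result.push_back(static_cast<typename StringType::value_type>(*p));
    return result;
}

template<typename StringType>
void hello_string()
{
    StringType result = from_narrow<StringType>("hello");
}

int main()
{
    hello_string<std::string>();
    hello_string<std::wstring>();
    hello_string<std::u16string>();
}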
