I'm currently trying to declare an array of 17 std::bitsets, each 32 bits long. I'm doing it like this:
std::bitset<32> mTestInstruction[17]
{
std::string("01000000001000000000000000000001"),
std::string("01000000011000000000000001100011"),
std::string("01000000101000000000000000000001"),
std::string("10100000000000000000000000001010"),
std::string("00000000100000010000000010000010"),
std::string("00000000110001010010000000000001"),
std::string("01001000111001010000000000000000"),
std::string("01000100001000110000000000000011"),
std::string("01000000001000010000000000000001"),
std::string("10000000000000000000000000000011"),
std::string("00000000010000000000000000000001"),
std::string("00000000111000000000000000000001"),
std::string("00000000111001110000100000000001"),
std::string("01000000010000100000000000000001"),
std::string("01000100001000100000000000000010"),
std::string("10000000000000000000000000001100"),
std::string("11100000000000000000000000001000"),
};
And I'm receiving the following error:
error: could not convert 'std::__cxx11::basic_string<char>(((const char*)"01000000001000000000000000000001"), std::allocator<char>())' from 'std::__cxx11::string {aka std::__cxx11::basic_string<char>}' to 'std::bitset<32u>'
one error for each of the bit strings.
I don't understand why this is happening, because according to cppreference a std::string is a documented way of constructing a std::bitset. Could anyone point out how to fix this issue, please?
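For comparison, here is a minimal single-bitset sketch (the variable name single is just for illustration) that does compile for me, which is why the array form above surprised me:

#include <bitset>
#include <iostream>
#include <string>

int main()
{
    // Constructing one std::bitset<32> directly from a std::string compiles.
    std::bitset<32> single(std::string("01000000001000000000000000000001"));
    std::cout << single << '\n';   // prints the 32-bit pattern back out
    return 0;
}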