According to the answer to this question, std::wstring can be equivalent to either std::u16string or std::u32string, depending on the platform.
According to the first answer to this question, one can simply convert to std::u16string and obtain a std::wstring as the result.
What I wonder is: how do I know whether I have the 16-bit or the 32-bit representation? If I want to convert UTF-8 to std::wstring, it looks like I can't use the solution given, because I don't know in advance which width wchar_t will have on the target platform.
So, how do I convert it properly? Or is this not relevant, and will the conversion always succeed without losing anything, regardless of whether I have the 16-bit or the 32-bit representation?
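To make the question concrete, here is a sketch of what I had in mind: a hypothetical utf8_to_wstring helper that selects a codecvt facet based on the width of wchar_t, using the std::wstring_convert machinery (which I know is deprecated since C++17). Is something along these lines the right approach, or is there a simpler way?

    #include <codecvt>  // std::codecvt_utf8, std::codecvt_utf8_utf16 (deprecated since C++17, still available)
    #include <cwchar>   // WCHAR_MAX
    #include <locale>   // std::wstring_convert
    #include <string>

    // Hypothetical helper: convert a UTF-8 encoded std::string to std::wstring,
    // choosing the facet according to the width of wchar_t on this platform.
    std::wstring utf8_to_wstring(const std::string& utf8)
    {
    #if WCHAR_MAX <= 0xFFFF
        // wchar_t is 16 bits wide (e.g. Windows): treat wstring as UTF-16,
        // so surrogate pairs are produced for code points outside the BMP.
        std::wstring_convert<std::codecvt_utf8_utf16<wchar_t>> conv;
    #else
        // wchar_t is 32 bits wide (e.g. Linux, macOS): treat wstring as UTF-32.
        std::wstring_convert<std::codecvt_utf8<wchar_t>> conv;
    #endif
        return conv.from_bytes(utf8);  // throws std::range_error on invalid UTF-8
    }

    int main()
    {
        std::string utf8 = "h\xC3\xA9llo";          // "héllo" as raw UTF-8 bytes
        std::wstring wide = utf8_to_wstring(utf8);  // 5 wide characters on either platform
        return wide.size() == 5 ? 0 : 1;
    }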
Can someone please clarify?
Thank you.