How to Convert from LPCTSTR to std::string

How do I convert from LPCTSTR to std::string?

Tip of the iceberg

LPCTSTR can be either a single-byte or a wide-character string (depending on whether the UNICODE constant is defined during compilation), while std::string's users (including your function) normally use it to hold a single-byte string.

You'd need two conversions: one for LPCSTR (non-UNICODE build) and one for LPCWSTR (UNICODE build). The first one is simple:

std::string convert(LPCSTR str) {
    return std::string(str);
}

The second one needs its input parameter to be converted to another encoding first with WideCharToMultiByte. Do not be alarmed by the name, the result can be a single-byte char string; that depends on the CodePage parameter. You will have to use a codepage for a single-byte encoding, such as CP_ACP.

Update: WideCharToMultiByte example

Be aware that accurately converting to a single-byte encoding is technically impossible if the input string contains characters not existing in the target encoding's code page. Since you mention it's going to be for filesystem functions, if the file path contains such characters the conversion will not be 100% accurate and the subsequent function calls will fail.

std::string MBFromW(LPCWSTR pwsz, UINT cp) {
    int cch = WideCharToMultiByte(cp, 0, pwsz, -1, 0, 0, NULL, NULL);

    char* psz = new char[cch];

    WideCharToMultiByte(cp, 0, pwsz, -1, psz, cch, NULL, NULL);

    std::string st(psz);
    delete[] psz;

    return st;
}

Caveat emptor: The example above is from some code I had lying around and is not production-grade quality. The one immediately obvious flaw is that it is not exception-safe. It might also kill all the nice purple unicorns. Use it only as an example.

The full encoding hell

The naked truth is that std::string can be used for multibyte encodings (such as UTF-8) just fine -- you can even use it to hold wide-char strings, since it's just a binary-safe array of bytes at heart.

The problem is that the STL functions that apply to std::string expect its contents to be in a single-byte encoding, and they won't produce correct results if this is not true.

By extension, we don't know what your function that takes an std::string parameter expects -- it might expect a string encoded in UTF-8. But "by convention", I'm assuming it also wants a single-byte-encoded string.

How to convert LPCTSTR to std::string in C++

LPCTSTR is an alias for const TCHAR*, where TCHAR is either char or wchar_t, depending on project configuration.

The cast of i_bstrCameraName to WCHAR* is wrong if your project is configured to map TCHAR to char. Just use the char data as-is instead.

And the conversion of wchar_t characters to char in the for loop is wrong if TCHAR is configured to map to wchar_t instead. That will lose data for non-ASCII characters. You would need to use WideCharToMultiByte() or equivalent to convert the wchar_t data to char properly.

You really should not be using TCHAR-based functionality in modern code at all. That has not been needed since the days when Microsoft was migrating users from ANSI-based Win9x/ME to Unicode-based WinNT+.

But, if you must, your code should look more like this instead:

std::string PlaybackStart(LPCTSTR i_bstrCameraName)
{
    std::string fcid;

#ifdef UNICODE
    int wlen = lstrlenW(i_bstrCameraName);
    int len = WideCharToMultiByte(CP_ACP, 0, i_bstrCameraName, wlen, NULL, 0, NULL, NULL);
    fcid.resize(len);
    WideCharToMultiByte(CP_ACP, 0, i_bstrCameraName, wlen, &fcid[0], len, NULL, NULL);
#else
    fcid = i_bstrCameraName;
#endif

    std::string myURL = "SOME IP";

    myURL += fcid;

    return myURL;
}

That being said, the parameter name i_bstrCameraName suggests that the parameter should actually be declared as a BSTR, not an LPCTSTR. BSTR is an alias for OLECHAR*, i.e. wchar_t*, e.g.:

std::string PlaybackStart(BSTR i_bstrCameraName)
{
    std::string fcid;

    int wlen = SysStringLen(i_bstrCameraName);
    int len = WideCharToMultiByte(CP_ACP, 0, i_bstrCameraName, wlen, NULL, 0, NULL, NULL);
    fcid.resize(len);
    WideCharToMultiByte(CP_ACP, 0, i_bstrCameraName, wlen, &fcid[0], len, NULL, NULL);

    std::string myURL = "SOME IP";

    myURL += fcid;

    return myURL;
}

How to convert std::string to LPCSTR?

str.c_str() gives you a const char *, which is an LPCSTR (Long Pointer to Constant STRing) -- meaning a pointer to a null-terminated string of characters. The W variant (LPCWSTR) means a wide string (composed of wchar_t instead of char).

How to convert std::wstring to LPCTSTR in C++?

Simply use the c_str member function of std::wstring (or std::string).

See here:

http://www.cplusplus.com/reference/string/string/c_str/

std::wstring somePath(L"....\\bin\\javaw.exe");

STARTUPINFO si = { sizeof(si) };
PROCESS_INFORMATION pi = {};
LPTSTR cmdline = NULL;                 // command line, if any

if (!CreateProcess(somePath.c_str(),   // Application path (LPCTSTR in a UNICODE build).
                   cmdline,            // Command line.
                   NULL,               // Process handle not inheritable.
                   NULL,               // Thread handle not inheritable.
                   FALSE,              // Set handle inheritance to FALSE.
                   CREATE_NO_WINDOW,   // On Vista/Win7, this creates no window.
                   NULL,               // Use parent's environment block.
                   NULL,               // Use parent's starting directory.
                   &si,                // Pointer to STARTUPINFO structure.
                   &pi))               // Pointer to PROCESS_INFORMATION structure.
{
    printf("CreateProcess failed\n");
    return 0;
}

std::string to LPCTSTR

Your problem here is the fact that LPCTSTR resolves to wchar_t* or char* based on whether your build is a Unicode build (the UNICODE preprocessor symbol defined or not).

To explicitly call the char* version, call CreateDirectoryA(), e.g. CreateDirectoryA(str.c_str(), NULL).


