As an example, take this question from 2008: "How do I tokenize a string in C++?" https://stackoverflow.com/q/53849/1593077

A very popular, straightforward, traditional-style answer to it, given early on, was:
#include <string>
#include <vector>

std::vector<std::string> split(const char *str, char c = ' ') {
    std::vector<std::string> result;
    do {
        const char *begin = str;
        // advance to the next delimiter or the terminating NUL
        while (*str != c && *str) { str++; }
        result.push_back(std::string(begin, str));
    } while (0 != *str++);
    return result;
}
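For comparison with what follows, note that a call to this function is fully eager: the entire vector of tokens is built up front. A minimal usage sketch (mine, not part of the quoted answer):

#include <iostream>

int main() {
    // assumes the split() defined above is in scope
    for (const std::string &token : split("how do I tokenize a string"))
        std::cout << token << '\n';
}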
but a recent answer is:
auto results = str | ranges::views::tokenize(" ",1);
which is in a lazily-evaluated, functional style, and doesn't even directly use the fugly standard C++ string class. This example is of course a bit simplistic (since the main change exhibited here is in the standard library), but the point is that the language has demonstrated a strong ability to reconfigure how users tend to write code. But perhaps I'm giving it more credit than it's due, and this won't come to pass.
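For readers without range-v3 at hand, roughly the same lazy, pipeline-style idea can be sketched with the standard C++20 ranges library. This splits on a single space with std::views::split rather than range-v3's tokenize, so treat it as an approximation of the quoted answer, not the answer itself:

#include <iostream>
#include <ranges>
#include <string_view>

int main() {
    std::string_view str = "how do I tokenize a string";
    // Each token is a lightweight subrange, produced lazily as the loop asks for it.
    for (auto &&token : str | std::views::split(' ')) {
        for (char ch : token) { std::cout << ch; }
        std::cout << '\n';
    }
}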