void Object::TokenizeLines(const string& str, vector<string>& tokens, const string& delimiters)
{
    // Skip delimiters at beginning.
    string::size_type lastPos = str.find_first_not_of(delimiters, 0);
    // Find first "non-delimiter".
    string::size_type pos = str.find_first_of(delimiters, lastPos);

    while (string::npos != pos || string::npos != lastPos) {
        // Found a token, add it to the vector.
        tokens.push_back(CheckWord(str, lastPos, pos - lastPos));
        // Skip delimiters. Note the "not_of".
        lastPos = str.find_first_not_of(delimiters, pos);
        // Find next "non-delimiter".
        pos = str.find_first_of(delimiters, lastPos);
    }
}
I call it like this: string.Tokenize(str, tokens, "\n");
Because each line held fields like abc,de,,,f,g,,h, the function above turned out to be faulty for my input: the runs of commas collapse and the empty fields disappear from the output.
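To show what I mean, here is a minimal standalone sketch of the same loop (I swapped my CheckWord member for a plain str.substr call, so this is only an illustration of the logic, not my exact class code):

#include <iostream>
#include <string>
#include <vector>
using namespace std;

// Same logic as TokenizeLines above, free-standing for testing.
static void Tokenize(const string& str, vector<string>& tokens, const string& delimiters)
{
    string::size_type lastPos = str.find_first_not_of(delimiters, 0);
    string::size_type pos = str.find_first_of(delimiters, lastPos);
    while (string::npos != pos || string::npos != lastPos) {
        tokens.push_back(str.substr(lastPos, pos - lastPos));
        lastPos = str.find_first_not_of(delimiters, pos); // jumps over whole runs of delimiters
        pos = str.find_first_of(delimiters, lastPos);
    }
}

int main()
{
    vector<string> tokens;
    Tokenize("abc,de,,,f,g,,h", tokens, ",");
    for (vector<string>::size_type i = 0; i != tokens.size(); ++i)
        cout << i << ": \"" << tokens[i] << "\"\n";
    return 0;   // prints only 5 tokens: abc, de, f, g, h
}

Only five tokens come out of abc,de,,,f,g,,h; the empty fields between the commas are gone. So I found this one instead: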
void TokenizeWithComma(const string& str, vector<string>& tokens)
{
    const char* first = str.c_str();                        // start of the character data
    const char* last = str.c_str() + strlen(str.c_str());   // one past the last character
    while (first != last) {
        // std::find returns a pointer to the next ',' or `last` if there is none.
        const char* next = find(first, last, ',');
        // Copy everything from `first` up to the comma; this may be an empty string.
        tokens.push_back(string(first, next - first));
        // Step past the comma; min() keeps `first` from running beyond `last`.
        first = min(next + 1, last);
    }
}
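This one keeps the empty fields, which is what I need. A quick test I ran against it:

#include <algorithm>   // std::find, std::min
#include <cstring>     // strlen
#include <iostream>
#include <string>
#include <vector>
using namespace std;

// ... TokenizeWithComma exactly as above ...

int main()
{
    vector<string> tokens;
    TokenizeWithComma("a,bc,,d", tokens);
    for (vector<string>::size_type i = 0; i != tokens.size(); ++i)
        cout << i << ": \"" << tokens[i] << "\"\n";
    return 0;   // prints 4 tokens: "a", "bc", "" (empty), "d"
}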
After spending too many hours on this today, I have my homework done, but I am not sure what I did here...
Can someone (not a novice like me) give me a more detailed explanation?
Also:
2. Why doesn't the first function work with strings like a,bc,,d?
3. I also tried this for each word (the words are in a vector):
vector<string>::iterator w_iter;
for (w_iter = token_lines.begin(); w_iter != token_lines.end(); ++w_iter) {
    string ff = *w_iter;
    string::size_type loc = ff.find("abc", 0);   // search for "abc" from position 0
    if (loc != string::npos) { cout << "Found \"abc\" at " << loc << endl; }
    else { cout << "Didn't find \"abc\"" << endl; }
}
This drove me nuts. To find patterns I had to divide the string with substr, and add extra cases so I would not pass illegal positions to substr (which is also a member function of string).
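By "extra cases" I mean roughly this sketch (AfterPattern is a hypothetical helper name, just to illustrate the bounds checks):

#include <iostream>
#include <string>
using namespace std;

// Hypothetical helper: return up to `len` characters that follow
// `pattern` inside `s`, without ever giving substr an illegal position.
string AfterPattern(const string& s, const string& pattern, string::size_type len)
{
    string::size_type loc = s.find(pattern, 0);
    if (loc == string::npos)
        return "";                     // pattern not found at all
    string::size_type start = loc + pattern.size();
    if (start >= s.size())
        return "";                     // pattern sits at the very end
    return s.substr(start, len);       // substr clips len to what is left
}

int main()
{
    cout << AfterPattern("xxabcyy", "abc", 5) << endl;   // prints "yy"
    cout << AfterPattern("xxabc", "abc", 5) << endl;     // prints nothing, but no crash
    return 0;
}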
I know this might be a boring topic for most of you, but I appreciate your help...
Thank you