c++ large file performance with char* and pitfalls
I'm writing an application that manipulates massive text files (180+ MB), and I'm trying to come up with a memory schema to handle the data.
bigfile.txt is a \r-delimited list of strings.
My current idea is to perform:
    #include <fstream>
    #include <iterator>
    #include <string>

    std::string str;
    std::ifstream in2("bigfile.txt", std::ios::binary);
    if (in2.is_open()) {
        // size the string up front so the read doesn't reallocate
        in2.seekg(0, std::ios::end);
        str.reserve(in2.tellg());
        in2.seekg(0, std::ios::beg);
        str.assign((std::istreambuf_iterator<char>(in2)),
                   std::istreambuf_iterator<char>());
    }
    // turn each '\r' into a NUL terminator in place
    for (size_t i = 0; i < str.size(); i++) {
        if (str[i] == '\r') {
            str[i] = 0;
        }
    }
Then I'd go through the list and build a vector of char* pointers mapping each entry into the string, roughly like the sketch below.
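A rough sketch of that mapping step (the names lines, p, and end are just for illustration, and this assumes the C++11 guarantee that a std::string's storage is contiguous):

    #include <cstring>
    #include <vector>

    // walk the buffer and record a char* at the start of each
    // NUL-terminated entry
    std::vector<char*> lines;
    char* p = &str[0];
    char* end = p + str.size();
    while (p < end) {
        lines.push_back(p);           // points into str's internal buffer
        p += std::strlen(p) + 1;      // skip past this entry and its '\0'
    }

(std::strlen on the final entry is safe even if the file doesn't end in \r, because str[str.size()] is a guaranteed '\0'.)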
Are there any downfalls to this design?
And as long as I don't delete the char*'s, I shouldn't leak memory, right? Because when the string loses scope it will clear itself, correct?
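Here is the situation I'm asking about as a minimal sketch (make_lines and the literal data are just placeholders for the real loading code): nothing is allocated with new, so there is nothing to delete, but the pointers are views into str's buffer and are only valid while str is alive:

    #include <string>
    #include <vector>

    std::vector<char*> make_lines() {
        std::string str("one\0two\0three", 13);  // stand-in for the loaded file
        std::vector<char*> lines;
        lines.push_back(&str[0]);   // "one": points into str's buffer
        lines.push_back(&str[4]);   // "two"
        lines.push_back(&str[8]);   // "three"
        return lines;  // str is destroyed here, so every returned pointer dangles
    }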