I am trying to write a large amount of data to a CSV file; the number of rows is in the millions. My code generates each row as an array of doubles. Writing to the file takes a very long time (twice as long as inserting the same number of records into a database with 10k bulk inserts on the same machine).
I have tried buffering 10,000 and 100,000 rows in a std::string before writing.
// The class name was elided in the original; "MyClass" is a placeholder.
void MyClass::PrepareRow()
{
    // Columns are indexed 1..m_nColumnCount in the original code.
    for (int i = 1; i <= m_nColumnCount; i++)
    {
        if (arrayOfRowVals[i] != NULL_NUMBER)
        {
            char buffer[50] = {};
            snprintf(buffer, sizeof buffer, "%9.7lf", arrayOfRowVals[i]);
            strBulkcsvString.append(buffer);
        }
        if (i < m_nColumnCount)
            strBulkcsvString.append(",");
    }
    strBulkcsvString.append("\n");
    m_rowcount++;
    if (m_rowcount == 10000)
    {
        // Bug in the original: the buffer was cleared *before* the write,
        // so nothing ever reached the file. Write first, then clear.
        // Note also that reopening the file for every batch is expensive.
        m_Csvfile.open("/install/FactPOC/Fact.csv", std::ios_base::out | std::ios_base::app);
        m_Csvfile << strBulkcsvString;
        m_Csvfile.close();
        strBulkcsvString.clear();
        m_rowcount = 0;
    }
}
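A faster pattern, sketched below under some assumptions (the class name CsvWriter, the flush threshold kFlushRows, and the column layout are all illustrative, not from the original): open the stream once and keep it open for the program's lifetime, pre-reserve the string buffer, format numbers with snprintf into a small stack buffer, and write each full batch with a single unformatted write() call. The destructor flushes the final partial batch, which the original code silently drops.

```cpp
#include <cassert>
#include <cstdio>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical sketch: one long-lived stream, one big append buffer.
class CsvWriter {
public:
    explicit CsvWriter(const std::string& path)
        : m_out(path, std::ios_base::out | std::ios_base::trunc) {
        m_buf.reserve(kFlushRows * 64);  // avoid repeated reallocation
    }
    ~CsvWriter() { Flush(); }            // flush the final partial batch

    void AppendRow(const std::vector<double>& vals) {
        char num[32];
        for (std::size_t i = 0; i < vals.size(); ++i) {
            // Same "%9.7f" layout the question uses.
            int n = std::snprintf(num, sizeof num, "%9.7f", vals[i]);
            m_buf.append(num, static_cast<std::size_t>(n));
            if (i + 1 < vals.size()) m_buf.push_back(',');
        }
        m_buf.push_back('\n');
        if (++m_rows == kFlushRows) Flush();
    }

    void Flush() {
        // One unformatted write per batch; the file stays open throughout.
        m_out.write(m_buf.data(), static_cast<std::streamsize>(m_buf.size()));
        m_buf.clear();
        m_rows = 0;
    }

private:
    static const std::size_t kFlushRows = 10000;  // illustrative threshold
    std::ofstream m_out;
    std::string m_buf;
    std::size_t m_rows = 0;
};
```

This removes the two main costs in the question's version: the open/close cycle on every batch and the per-batch formatted `operator<<` call. A usage example:

```cpp
CsvWriter w("Fact_sample.csv");
w.AppendRow({1.5, 2.25});   // buffered; written on Flush() or destruction
```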