Question

Why is my bot unable to handle a CSV file that has 42,000 rows and 12 columns?

  • 18 May 2022
  • 3 replies
  • 104 views

Currently I'm building a bot that handles large amounts of data. My bot ends up timing out when it runs the 'CSV/TXT: Read' action. When I read a similar file with 20,000 rows and 12 columns, it finishes within a minute.


Is the problem I'm having due to the limitations of the data table variable? Or does it have to do with the CSV file itself? If it is the CSV file, does anyone know a good workaround for reading such a huge file?


Thank you in advance!


3 replies

You can use the "read CSV as database" method. Drivers for this are available online; just install and connect. Then you can pull the CSV data with a SELECT query and use a "for each row" loop over the result set. The same approach also works for Excel as a database. A minimal sketch is below.
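
For instance, something like this in Python (a rough sketch assuming the Microsoft Access Text Driver ODBC driver is installed; the folder, file, and column names are just placeholders):

```python
# Query a CSV file through ODBC instead of loading it all at once.
# Assumes the Microsoft Access Text Driver is installed; the folder
# and file names below are placeholders for illustration.
import pyodbc

conn = pyodbc.connect(
    r"Driver={Microsoft Access Text Driver (*.txt, *.csv)};"
    r"DBQ=C:\data;"  # folder containing the CSV, not the file itself
    r"Extensions=asc,csv,tab,txt;"
)
cursor = conn.cursor()

# The file name acts as the table name in the SELECT query.
cursor.execute("SELECT * FROM [bigfile.csv]")
for row in cursor:  # "for each row" over the result set
    print(row)      # placeholder for per-row bot logic

conn.close()
```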



Is there documentation available for this method? I have tried searching for it, but it seems some pages have been removed because they were outdated.


Using Excel as a database for a CSV file source doesn't seem efficient to me. It would be better to devise a scheme that imports the CSV into an actual database and then runs the SQL against that database; an Access .mdb or .accdb file would be more efficient. A sketch of the import-then-query idea follows.
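
As a rough sketch of that scheme (using Python's built-in sqlite3 here as a stand-in for the Access database suggested above, since it needs no driver install; the file name is a placeholder):

```python
# One-time import of the CSV into a real database, then query the
# database instead of the text file. sqlite3 stands in for Access;
# "bigfile.csv" is a placeholder name.
import csv
import sqlite3

conn = sqlite3.connect("staging.db")
with open("bigfile.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)
    cols = ", ".join(f'"{h}"' for h in header)
    marks = ", ".join("?" * len(header))
    conn.execute(f"CREATE TABLE IF NOT EXISTS data ({cols})")
    conn.executemany(f"INSERT INTO data VALUES ({marks})", reader)
    conn.commit()

# Later queries run against the database, not the CSV text stream.
for row in conn.execute("SELECT * FROM data LIMIT 5"):
    print(row)
conn.close()
```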

My reasoning is that a CSV is a text file: you're actively changing the data in the file with every query, and the latency in accessing the data stream is going to be excessive and inconsistent.

Also, a CSV Read of a large amount of data brings all of it into memory; depending on your device's configuration, a huge data table might overload the memory. If you do have to stay with plain CSV, streaming the file row by row avoids holding the whole table at once, as in the sketch below.
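
A minimal sketch of that row-by-row alternative in Python (file name is a placeholder):

```python
# Stream the CSV one row at a time instead of materializing all
# 42,000 rows as a single in-memory data table.
import csv

with open("bigfile.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):  # one dict per row, keyed by header
        print(row)  # placeholder for per-row processing
```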
