Question

Why is my bot unable to handle a CSV file that has 42,000 rows and 12 columns?


I'm currently building a bot that handles large amounts of data. The bot times out when it runs the 'CSV/TXT: Read' action, yet it can read a similar file with 20,000 rows and 12 columns within a minute.

 

Is the problem due to a limitation of the data table variable, or does it have to do with the CSV file itself? If it is the CSV file, does anyone know a good workaround for reading such a huge file?

 

Thank you in advance!

6 replies

  • Cadet | Tier 2
  • 1 reply
  • May 20, 2022

You can use the 'read CSV as database' method. Drivers for this are available online; just install one and connect. You can then query the CSV data with a SELECT statement and loop over each row of the result set. The same approach also works for Excel as a database.
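
For example, here is a rough sketch of that approach from PowerShell, assuming the Microsoft Access Text Driver is installed (check ODBC Data Sources on the device); the folder, file, and column names below are placeholders:

    # Connect to a folder of CSV files through the ODBC text driver
    $conn = New-Object System.Data.Odbc.OdbcConnection
    $conn.ConnectionString = 'Driver={Microsoft Access Text Driver (*.txt, *.csv)};Dbq=C:\temp\;Extensions=asc,csv,tab,txt;'
    $conn.Open()

    # Each CSV in the Dbq folder is exposed as a table named after the file
    $cmd = $conn.CreateCommand()
    $cmd.CommandText = 'SELECT * FROM [data.csv]'   # placeholder file name
    $reader = $cmd.ExecuteReader()
    while ($reader.Read()) {
        $reader['SomeColumn']                       # placeholder column name
    }
    $conn.Close()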


  • Cadet | Tier 2
  • 6 replies
  • July 24, 2024
Janani.R4 wrote:

You can use the 'read CSV as database' method. Drivers for this are available online; just install one and connect. You can then query the CSV data with a SELECT statement and loop over each row of the result set. The same approach also works for Excel as a database.

Is there documentation available for this? I have tried searching for it, but it seems some of it has been removed because it was outdated.


jon.stueve
Automation Anywhere Team

Using Excel as a database for a CSV file source doesn't seem efficient to me. It would be better to devise a scheme of importing the CSV into an actual database and then running the SQL against that database. An Access .mdb or .accdb file would be more efficient.

My reasoning is that CSV is a text file: you're actively changing the data in the file with every query, and the latency in accessing the data stream is going to be excessive and inconsistent.

Also, a CSV Read of a large amount of data brings all of that data into memory; depending on your device's configuration, you might overload memory with a huge data table.
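
As a rough illustration of the memory point, a pipeline like the following (with a placeholder path and column name) streams rows one at a time instead of materializing the whole table:

    # Stream the CSV one row at a time through the pipeline rather than
    # loading all 42,000 rows into memory at once
    Import-Csv -Path 'C:\temp\data.csv' | ForEach-Object {
        $_.SomeColumn   # placeholder column; process each row here
    }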


  • Cadet | Tier 2
  • 1 reply
  • July 27, 2024

There could be several reasons why your bot is struggling with a CSV file of that size. Ensure that the data in the CSV file is clean and conforms to the expected formats; corrupt or malformed data can cause issues during processing.
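
For instance, a quick sanity check along these lines (placeholder path; note the naive comma split ignores quoted fields) can flag rows whose field count doesn't match the header:

    # Flag rows whose field count differs from the header row
    $lines = Get-Content -Path 'C:\temp\data.csv'
    $expected = ($lines[0] -split ',').Count
    $lines | Select-Object -Skip 1 | ForEach-Object {
        if (($_ -split ',').Count -ne $expected) { $_ }   # print suspect rows
    }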


  • Flight Specialist | Tier 4
  • 63 replies
  • August 16, 2024

We have seen this as well and solved it by using a different encoding for the CSV when it's read into the data table. It's NOT a row limitation; it has to do with the character set in the data. ASCII is a 7-bit character set containing 128 characters, and ANSI is also limited. So try using UTF-8. That solved the issue for us :-)
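
If the file arrives in an ANSI code page, one way to re-save it as UTF-8 before the CSV/TXT: Read step is a small PowerShell command like this (placeholder paths; in Windows PowerShell 5.1, -Encoding Default means the system ANSI code page):

    # Re-encode an ANSI CSV as UTF-8 before the bot reads it
    Get-Content -Path 'C:\temp\data.csv' -Encoding Default |
        Set-Content -Path 'C:\temp\data_utf8.csv' -Encoding UTF8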


  • Navigator | Tier 3
  • 13 replies
  • August 23, 2024

If I may be so bold, I like to use SQLite as a file-based database in cases like these. I'll open a PowerShell window, start sqlite3, use its ".mode csv" and ".import" commands to import the data into a table (it uses the header line to label the table columns), and then connect to the database just created. Query and manipulate the data using SQL. It allows you to do very complex kinds of groupings that would be tedious to handle with A360 tasks. Here is the simplest of examples (the paths, table name, and database name are placeholders):
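
    # From a PowerShell prompt: pipe sqlite3 dot-commands to import the CSV,
    # then query it with SQL (assumes sqlite3.exe is on the PATH)
    $commands = '.mode csv',
                '.import C:/temp/data.csv mytable',
                'SELECT COUNT(*) FROM mytable;'
    $commands | sqlite3 C:\temp\work.db

The bot can then connect to C:\temp\work.db and run its SELECTs against the imported table.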
