CSV Import
This topic contains 10 replies, has 0 voices, and was last updated by Daniel Giguere 10 years, 10 months ago.
-
January 13, 2014 at 8:50 am #24324
Claytonm
I have an inventory feed CSV file that gets emailed to me daily from a supplier; it is 400,000 line items long.
I believe NetSuite can only take 25,000 line items at a time, so the file has to be broken up into about 16 files.
Is there any faster way to import these files than just running 16 separate CSV imports and letting them take as long as they take?
I was told it could be done through scripting; however, there is a limitation of about 300 records per 15 minutes, which would end up being no faster than a regular CSV import.
Has anyone found a solution for speeding this process up? Is there any kind of XML feed something-or-other solution?
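For illustration, a minimal Python sketch of the splitting step, assuming the feed can be processed locally before upload. The 25,000-row limit comes from the description above; the file name and helper names are placeholders:

```python
import csv

CHUNK_SIZE = 25000                  # NetSuite CSV import row limit mentioned above
SOURCE_FILE = "inventory_feed.csv"  # placeholder name for the supplier's daily file

def split_csv(source: str, chunk_size: int = CHUNK_SIZE) -> None:
    """Split a large CSV into chunk_size-row files, repeating the header in each part."""
    with open(source, newline="") as infile:
        reader = csv.reader(infile)
        header = next(reader)
        part, rows = 1, []
        for row in reader:
            rows.append(row)
            if len(rows) == chunk_size:
                write_part(source, part, header, rows)
                part, rows = part + 1, []
        if rows:                     # write any leftover rows
            write_part(source, part, header, rows)

def write_part(source: str, part: int, header: list, rows: list) -> None:
    out_name = f"{source.rsplit('.', 1)[0]}_part{part:02d}.csv"
    with open(out_name, "w", newline="") as outfile:
        writer = csv.writer(outfile)
        writer.writerow(header)
        writer.writerows(rows)

if __name__ == "__main__":
    split_csv(SOURCE_FILE)           # 400,000 rows -> 16 files of up to 25,000 rows
```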
-
January 13, 2014 at 10:29 am #24325
Olivier Gagnon NC
RE: Importing Large Files
How big (in Mb) is the file?
-
January 13, 2014 at 10:56 am #24326
Claytonm
RE: Importing Large Files
Originally posted by Olivier Gagnon
How big (in Mb) is the file?
The biggest one is about 39 MB, but if you break them down into 16 smaller files of 25,000 rows each for CSV import, each file ends up being 2-3 MB.
-
January 13, 2014 at 11:24 am #24327
kastnerd
Depends on what you're updating. For pricing updates, we created temporary "future price" price levels. It then takes a few days to import all the pricing into the future price level, but when it's time to go live we use the swap price levels function and we get the new pricing live quickly.
-
January 13, 2014 at 11:33 am #24328
Claytonm
RE: Importing Large Files
Originally posted by kastnerd
Depends on what you're updating. For pricing updates, we created temporary "future price" price levels. It then takes a few days to import all the pricing into the future price level, but when it's time to go live we use the swap price levels function and we get the new pricing live quickly.
The file is a daily inventory feed from a supplier that updates a custom field on my item records called "manufacturer inventory".
-
January 13, 2014 at 3:55 pm #24329
Olivier Gagnon NC
RE: Importing Large Files
Well, a scripted solution would have a 5 MB file size limit, so that doesn't look like a good avenue. Doing an integration using either Web Services or RESTlets might be an option, slowly eating through the file, but this requires development cooperation with the Vendor.
-
January 14, 2014 at 8:10 am #24330
Claytonm
RE: Importing Large Files
Originally posted by Olivier Gagnon
Well, a scripted solution would have a 5 MB file size limit, so that doesn't look like a good avenue. Doing an integration using either Web Services or RESTlets might be an option, slowly eating through the file, but this requires development cooperation with the Vendor.
What type of cooperation? Do you mean cooperation beyond supplying me with the file? What would a Web Services solution take?
-
January 14, 2014 at 9:13 am #24331
Olivier Gagnon NC
RE: Importing Large Files
Well, I guess it depends a lot. I don't know how they supply the file. Let's say they place the file on an FTP server; you could then have a Web Services process monitor that FTP and, as soon as the file is there, the WS process starts feeding it into NS.
However, if there isn't a way for you to ping and check whether the latest file is there, you would need the vendor to tell you somehow that you can get the file.
This could go all the way to the vendor triggering the process for you, pushing the data into an awaiting RESTlet or Web Services endpoint. This would require development work on the Vendor's side, though.
So, ideally, you're able to "explore" the vendor's file structure and "discover" the file is available yourself; then you don't need your vendor to change anything on their side. This is mostly a Web Services solution, though; I don't think you could do this kind of architecture using RESTlets.
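A rough sketch of the "ping and check" idea, assuming plain FTP, Python's standard ftplib, and a server that supports the MDTM command. The host, credentials, and file name are placeholders, and a real integration would hand the file off to the Web Services process rather than just print a message:

```python
import time
from ftplib import FTP

FTP_HOST = "ftp.example-vendor.com"   # placeholder vendor FTP host
FTP_USER = "username"                 # placeholder credentials
FTP_PASS = "password"
REMOTE_FILE = "inventory_feed.csv"    # placeholder remote file name

def latest_timestamp() -> str:
    """Return the remote file's modification time via the FTP MDTM command."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        return ftp.voidcmd(f"MDTM {REMOTE_FILE}")  # e.g. '213 20140114093000'

def poll(interval_seconds: int = 900) -> None:
    """Poll until the file's timestamp changes, then trigger the import process."""
    last_seen = latest_timestamp()
    while True:
        time.sleep(interval_seconds)
        current = latest_timestamp()
        if current != last_seen:
            last_seen = current
            print("New file detected; start feeding it into NetSuite here.")

if __name__ == "__main__":
    poll()
```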
-
January 14, 2014 at 11:14 am #24332
Claytonm
RE: Importing Large Files
Originally posted by Olivier Gagnon
Well, I guess it depends a lot. I don't know how they supply the file. Let's say they place the file on an FTP server; you could then have a Web Services process monitor that FTP and, as soon as the file is there, the WS process starts feeding it into NS.
However, if there isn't a way for you to ping and check whether the latest file is there, you would need the vendor to tell you somehow that you can get the file.
This could go all the way to the vendor triggering the process for you, pushing the data into an awaiting RESTlet or Web Services endpoint. This would require development work on the Vendor's side, though.
So, ideally, you're able to "explore" the vendor's file structure and "discover" the file is available yourself; then you don't need your vendor to change anything on their side. This is mostly a Web Services solution, though; I don't think you could do this kind of architecture using RESTlets.
The large file is hosted on their FTP server, which I have login credentials for. I log in to the FTP site and download the file to my computer. Then I have been breaking it apart into smaller files and importing them into NetSuite.
The file automatically gets overwritten daily with the same file name, at the same download link, at the same time. So whether Web Services accesses the file or I download it from the FTP and upload it to the File Cabinet, I can get NetSuite to the file. But my question is: how fast could Web Services get the data into the system, and would it be faster than importing 16 CSV files with 25,000 line items each?
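The download step of that manual workflow is easy to automate; a short sketch assuming plain FTP and the same placeholder host and credentials as above:

```python
from ftplib import FTP

FTP_HOST = "ftp.example-vendor.com"   # placeholder vendor FTP host
FTP_USER = "username"                 # placeholder credentials
FTP_PASS = "password"
REMOTE_FILE = "inventory_feed.csv"    # placeholder file name, overwritten daily by the vendor

def download_feed(local_path: str = REMOTE_FILE) -> str:
    """Download the daily feed from the vendor FTP to a local file."""
    with FTP(FTP_HOST) as ftp:
        ftp.login(FTP_USER, FTP_PASS)
        with open(local_path, "wb") as local_file:
            ftp.retrbinary(f"RETR {REMOTE_FILE}", local_file.write)
    return local_path

if __name__ == "__main__":
    path = download_feed()
    # The downloaded file can then be split (see the earlier sketch) or pushed
    # to NetSuite via Web Services / a RESTlet instead of manual CSV imports.
    print(f"Downloaded feed to {path}")
```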
-
January 14, 2014 at 6:35 pm #24333
Daniel Giguere
For a specific benchmark performed last year, SuiteTalk was 1.8 times faster than CSV import in that particular case. Results may differ! You can check against your own data by inserting, say, 1,000 records by web services (10 addList operations, each containing 100 records) and comparing the timings.
The advantage of CSV import is the automatic queuing done by NetSuite: you send the 16 files and NetSuite handles the rest. Using SuiteTalk, the application needs to be developed, but in many cases it may be worth it.
-
January 14, 2014 at 6:55 pm #24334
Daniel Giguere
I should have mentioned that the approach selected for this specific integration, following the benchmark, was to use RESTlets with 10 concurrent connections, each sending 5 records at a time (configurable). The application we developed reads the data from a database, automatically converts it to JSON, and sends it to a NetSuite RESTlet, with 10 workers pushing data in parallel. Finally, each individual response is written back to the database on the corresponding lines.
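The application itself is not shown in the thread, so the following is only an illustrative Python sketch of that worker pattern: 10 threads each POSTing batches of 5 JSON records to a RESTlet. The URL, the Authorization header, and the record fields are placeholders; a real call needs an actual RESTlet deployment and NetSuite's RESTlet authentication, and the real version would read from and write back to the database:

```python
import json
from concurrent.futures import ThreadPoolExecutor

import requests  # third-party; pip install requests

RESTLET_URL = "https://rest.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1"  # placeholder
HEADERS = {
    "Content-Type": "application/json",
    "Authorization": "<NetSuite RESTlet auth goes here>",  # placeholder, not a real auth value
}
WORKERS = 10      # concurrent connections, as described above
BATCH_SIZE = 5    # records per request (configurable)

def batches(records, size=BATCH_SIZE):
    """Yield successive fixed-size batches from the full record list."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def send_batch(batch):
    """POST one batch of records to the RESTlet and return its JSON response."""
    response = requests.post(RESTLET_URL, headers=HEADERS, data=json.dumps(batch))
    response.raise_for_status()
    return response.json()

def push_all(records):
    """Push all batches with a pool of worker threads and collect the responses in order."""
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        return list(pool.map(send_batch, batches(records)))

if __name__ == "__main__":
    # Placeholder records; the real application reads these from a database and
    # writes each response back to the corresponding rows.
    sample = [{"item_id": str(i), "manufacturer_inventory": 100 + i} for i in range(50)]
    for result in push_all(sample):
        print(result)
```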
-