This topic contains 10 replies, has 0 voices, and was last updated by Voltron 11 years, 1 month ago.

  • Author
    Posts
  • #24199

    Aaron@MIT

    I have a saved CSV import that I use to import batches of records at a time. Whenever I import a batch of records, each record in the batch is associated with a single parent record (each batch has a different parent, but all records in a batch have the same parent).

    I need to update the parent after the CSV import, but only after the import completes. It does not need to complete 100%, but I don't want to schedule a script that might run while more records are still being imported.

    Suggestions? Thanks.

  • #24200

    khultquist

You could script the CSV import:

1. Create a folder for incoming CSV files.

2. Script the import.

3. Script removal of the file from the folder.

4. Script the parent update.
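In outline, the scheduled script loops over the folder's files, imports each one, removes it, and only then updates that file's parent. A minimal sketch of that control flow, using hypothetical stand-ins (`listFiles`, `runImport`, etc.) for the actual NetSuite file-cabinet and import calls, operating on a plain in-memory "folder" so the flow can be traced:

```javascript
// Hypothetical stand-ins for the NetSuite file-cabinet and import APIs;
// here they operate on a plain in-memory "folder" object.
function listFiles(folder) { return folder.files.slice(); }
function removeFile(folder, name) {
  folder.files = folder.files.filter(function (f) { return f.name !== name; });
}
function runImport(file) { return file.rows.length; } // pretend import; returns row count
function updateParent(parents, parentId, count) { parents[parentId] = count; }

// The scheduled script's main loop: import, remove, then update the parent,
// so the parent update only happens after that file's import has finished.
function processIncoming(folder, parents) {
  listFiles(folder).forEach(function (file) {
    var imported = runImport(file);                 // script the import
    removeFile(folder, file.name);                  // script removes the file from folder
    updateParent(parents, file.parentId, imported); // script parent update
  });
}
```

The key property is ordering: because each parent update sits after its import in the same script body, it cannot run while that file's records are still being imported.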

  • #24201

    Voltron

    I like khultquist's idea.

    Another idea, perhaps a bit hacky, is to modify your files so that you have one column carrying a batch number (some type of ID) and another column flagging the last row. The script doesn't run until it sees a record with the "last row" flag; it then targets all unprocessed records with that batch ID (in case you re-use a batch number). This method assumes you're not multi-threading the import.

    If you still want to multi-thread, or just want something more robust, insert a column that lists the batch's total record count on each row. This is kind of dirty because you'd have to run a script against each file that searches for that batch, counts the records matching that batch number, and does nothing further until the result count matches the number in that column.
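That completeness check can be sketched as a plain function: given the records imported so far and the expected total carried in the extra column, decide whether the batch is done. The names here are illustrative, not NetSuite APIs:

```javascript
// Returns true once every row of the batch has arrived. Each row carries
// its batch ID and the total row count for that batch, as described above.
function batchComplete(rows, batchId) {
  var matching = rows.filter(function (r) { return r.batchId === batchId; });
  if (matching.length === 0) return false;
  var expected = matching[0].total; // every row of a batch states the same total
  return matching.length >= expected;
}
```

In a real scheduled script, `rows` would come from a saved search on the imported record type, filtered to unprocessed records with that batch ID.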

    With a little more thought, I could probably come up with something else a little slicker. If you can go with khultquist's method, do.

  • #24202

    Aaron@MIT

    Thanks. I will try khultquist's method. However, this brings up another question. I am trying to build this system so that the people who use it have to modify their files as little as possible. Ideally, the user would create the parent record, then click a button that opens a custom form where they select and schedule the import. They shouldn't have to modify their CSV to relate it to the parent record; instead, NetSuite would know of this relationship because the user clicked the button on the parent record to initiate the import. Is this possible, or would the CSV have to be edited to relate its records to the parent?

  • #24203

    amccausland

    No, the button would just need to send the script the parent ID somehow. I have done this by using formula defaults on an Inline HTML field, where the parameter parts of the URL to the suitelet used curly-brace field references.
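The idea is just to get the parent's internal ID into the suitelet URL as a query parameter; in the Inline HTML formula, the curly-brace field reference is substituted by NetSuite at render time. The URL assembly itself is ordinary string building. A sketch (the path and parameter name here are made-up examples, not a real NetSuite URL):

```javascript
// Build a suitelet link that carries the parent record's internal ID.
// The base path and the "parent" parameter name are illustrative.
function buildImportLink(baseUrl, parentId) {
  return baseUrl + '?parent=' + encodeURIComponent(parentId);
}
```

The suitelet then reads the `parent` parameter from the request and uses it when scripting the import, so the CSV never needs a parent column filled in by the user.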

  • #24204

    amccausland

    Of course, the button could also just call a client script function that does nlapiGetFieldValue() on the parent ID and kicks off the rest of your process using that value.

  • #24205

    smurphy820

    I would do the following:

    1. Add a hidden checkbox to the customer form (this is used to select the proper parent ID).

    2. Add a button, 'Mark as Import Parent' (or something), with a script to:

    A. Unmark all other customer records as import parent.

    B. Mark this record as import parent.

    3. Add a scheduled script that looks for an existing CSV file in a folder in the file cabinet.

    If it finds a file, look for a customer record marked as import parent, and unmark it (to prevent the process from finding it again on a subsequent run).

    If both are found, load the file into a file object and run a find-and-replace for the placeholder parent ID 999999, substituting the customer ID of the parent you found in the previous step.

    Start the CSV import.

    Delete the file from the file cabinet when the process completes.

    That should work with small modifications for your process. For this to work you will have to add a 'parent' column to the template your users are using, with 999999 in every row. The users, however, will not have to edit that at all.

    I have done a process similar to this before with success.

    Questions? Ask!
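The find-and-replace step above is plain text substitution on the CSV contents before the import is started. A sketch of just that piece (the `999999` sentinel is the placeholder value from the template; the function itself is illustrative, not a NetSuite API):

```javascript
// Replace the sentinel parent ID in the CSV text with the real parent's
// internal ID. The word boundaries keep e.g. "1999999" from matching.
function assignParent(csvText, realParentId) {
  return csvText.replace(/\b999999\b/g, String(realParentId));
}
```

In the scheduled script, the input would be the file object's contents and the output would be written back (or passed to the import) before the CSV import is kicked off.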

  • #24206

    Aaron@MIT

    RE: Run script after CSV import?

    Wow, thanks everyone for the suggestions. Using these, I was able to successfully script the import and assign the records to the correct parent.

    There is, however, one issue remaining. My whole goal was to update the parent once the import finishes. However, the best this can do (as far as I can see) is schedule the import, then continue running the script that scheduled it. As such, the rest of my script would likely run before the import even started! Blegh! How can I set up my scripts so that the parent update only runs after the import is completed? Does NetSuite even give the ability to "delay script until import is completed"? Perhaps my best bet is to skip NetSuite's built-in import functionality and instead use nlapiCreateRecord for every row that needs to be imported. At least then I would know the rows were created before the parent update ran.

  • #24207

    smurphy820

    You can use nlobjJobManager.getFuture() to get a reference to an object you can check for the job's status, so you can put your script into a loop that waits for the job to end, like this:

    Code:

    var jobId = manager.submit(mergeJobRequest);

    // Check the job status
    var future = manager.getFuture(jobId);

    // This section will cause your code to wait until the import finishes
    var canContinue = false;
    do {
        canContinue = future.isDone();
    } while (canContinue == false);

    // now continue your code

    The issue with this approach is that you need to keep a list of the IDs that were imported, or put some kind of marker on the records, to know which records to update. Also, if the import takes a long time you could get a connection timeout.

    I would look at your process and code to see whether, if you re-work it a little, there really is a reason you HAVE to update the parent after import.

    You could always put in a user event script with an afterSubmit function to post-process the record after it is created, and set its execution context to CSV Import only. You would still be handling each record separately, though.
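To avoid the open-ended busy-wait (and the timeout risk noted above), the loop can be bounded: poll the future a limited number of times and give up cleanly if the job is still running. A generic sketch, with a plain object standing in for the future returned by the job manager:

```javascript
// Poll a future-like object up to maxChecks times; returns true if the job
// finished within the budget, false if we gave up. In a real scheduled
// script, the "give up" branch would reschedule itself rather than spin.
function waitForJob(future, maxChecks) {
  for (var i = 0; i < maxChecks; i++) {
    if (future.isDone()) return true;
  }
  return false;
}
```

Bounding the loop turns "wait forever and maybe time out" into "wait a while, then hand off", which fits NetSuite's governance model better.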

  • #24208

    Aaron@MIT

    Ooh, Sean. That code looks enticing. Do you know what the duration for the timeout is? If it is like nlapiRequestURL, then the 45-second timeout might not cut it; imports sometimes don't even start for a couple of minutes.

    Your other suggestions bring up new questions as well. Do you know if there is any way to cancel a scheduled script? If so, I could easily put an after-submit on my child records that schedules an update to the parent. Then, when lots of children are imported, each would schedule it, and the next could cancel that schedule and schedule its own. The end result would be that only the last item imported would have its schedule "stick".

    I could also just perform this scheduling anyway. The problem is that sometimes we must import 80-100 records, and I would need a lot of deployments to prevent them from all being "in queue" when the last item gets imported. Not an ideal solution.
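The "only the last schedule sticks" idea is essentially a debounce. Even without the ability to cancel scheduled scripts, the same effect is possible by stamping the parent with a fresh token on every after-submit and having the scheduled update run only if its token is still the latest. A sketch with plain objects (not NetSuite APIs; in practice the token would live in a custom field on the parent):

```javascript
// Each imported child stamps the parent with a fresh token; the scheduled
// update only fires if its token is still the latest, so earlier schedules
// become no-ops instead of needing to be cancelled.
var nextToken = 0;

function stampParent(parent) {
  parent.token = ++nextToken;
  return parent.token;
}

function runScheduledUpdate(parent, myToken, update) {
  if (parent.token === myToken) update(parent); // only the last stamp "sticks"
}
```

Stale schedules exit immediately, so even if many of them queue up, only the one scheduled by the last imported child does any work.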

  • #24209

    Voltron

    After seeing some of the nuances of these other solutions, I think I prefer one of my own offerings.

    Originally posted by jonke

    …snip…

    If you want to multi-thread still or just have something more robust, insert a column that lists the total number of records on each row. This is kind of dirty because you'd have to run a script on each file that searches for that batch and checks the total number of records that match that batch number and doesn't do anything else until the number of results matches the number you put in that last column

    …snip…

    What would make this easier is to hard-code the batch number and row count in the import mapping, so you don't have to modify the file.

    Otherwise, I still haven't thought of a way to improve this. I still don't like the idea of having to update the batch number and row count each time. What if you mess that up? This needs a variation.

You must be logged in to reply to this topic.