This topic contains 2 replies, has 3 voices, and was last updated by Olivier Gagnon NC 7 years ago.

  • Author
    Posts
  • #21435

    Jordan Manningham

    I frequently have to update thousands of records with values for new fields/new features. I do so with a saved search and a scheduled script that I leave in testing status and execute on demand. I constantly hit the error that the execution is out of governance and have to manually run the script again. Is there a way to check the remaining governance before it hits the limit and continue the script without an error? I could leave the script to run every 15 minutes, but I'd rather not if possible.

  • #21436

    pcutler

    Absolutely, there are a number of approaches. One approach is to have your script process the results of a saved search built so that, once a record is processed, it no longer meets the criteria to be included in the search. After each record is processed, you can run the following code:

    Code:
    if (nlapiGetContext().getRemainingUsage() < 1000) {
        // Reschedule this deployment to run again immediately with a fresh
        // 10,000-unit governance allowance, then exit the current execution.
        nlapiScheduleScript(nlapiGetContext().getScriptId(), nlapiGetContext().getDeploymentId());
        return;
    }
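
    For context, here is a rough sketch of how that check might sit inside a full SuiteScript 1.0 scheduled script. The saved search ID ('customsearch_records_to_update'), record type, and field ID ('custbody_new_field') are placeholders for your own objects:

    Code:
    // Scheduled script entry point (SuiteScript 1.0).
    function updateRecords(type) {
        var context = nlapiGetContext();

        // The saved search should only return records that still need the update,
        // so each pass shrinks the result set until nothing is left to process.
        var results = nlapiSearchRecord('salesorder', 'customsearch_records_to_update') || [];

        for (var i = 0; i < results.length; i++) {
            nlapiSubmitField(results[i].getRecordType(), results[i].getId(),
                'custbody_new_field', 'new value');

            // Reschedule before governance runs out; the new execution starts with a
            // fresh 10,000-unit allowance and re-runs the saved search from the top.
            if (context.getRemainingUsage() < 1000 && i < results.length - 1) {
                var status = nlapiScheduleScript(context.getScriptId(), context.getDeploymentId());
                nlapiLogExecution('AUDIT', 'Rescheduled', 'Status: ' + status);
                return;
            }
        }
    }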

  • #21437

    Olivier Gagnon NC

    What Phil said. Also, a Map/Reduce script excels at this kind of "process each result of a saved search" workload and handles governance for you, so you never have to worry about it. It might be worth rewriting your script as a Map/Reduce as well.
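
    For reference, here is a skeleton of what that Map/Reduce might look like in SuiteScript 2.1; the saved search ID and field ID are the same placeholders as above:

    Code:
    /**
     * @NApiVersion 2.1
     * @NScriptType MapReduceScript
     */
    define(['N/search', 'N/record'], function (search, record) {

        // The framework runs the search, feeds each result to map(), and
        // yields/resumes around governance limits automatically.
        function getInputData() {
            return search.load({ id: 'customsearch_records_to_update' });
        }

        function map(context) {
            var result = JSON.parse(context.value);
            record.submitFields({
                type: result.recordType,
                id: result.id,
                values: { custbody_new_field: 'new value' }
            });
        }

        return { getInputData: getInputData, map: map };
    });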
