This topic contains 14 replies, has 0 voices, and was last updated by al3xicon 8 years, 8 months ago.

  • Author
    Posts
  • #8885

    al3xicon

    Hey all –

    Getting close to choosing BI software. Just wondering who else out there is connecting NetSuite to BI software, especially interested if you’re using the ODBC connector. What is your workflow? Where is your data being stored? Have you run into any issues? How was your ETL process?

    There’s been little commenting on this topic before, on related posts I’ve seen. I’m hoping I’m not the only one doing this right now.

    Beers,

    Alex

  • #8886

    trimblee

    Hi Alex,

    I can speak to connecting NetSuite to Tableau, which isn’t fun. There’s no native Tableau connector for NetSuite, so you’re left with 1) ODBC, 2) data warehouse, 3) RESTlet endpoints.

I’ve tried 1) and 2), and am considering testing out 3). The only obvious challenge I foresee with 3) is whether Tableau’s web data connector can paginate results, enabling retrieval of larger data sets (I haven’t tested it or even read any documentation yet).
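Whatever the client, the pagination piece itself is just a loop. Here’s roughly what I’d expect the connector to have to do, sketched in Python — note the `offset`/`limit` parameter names are my assumption, not a documented NetSuite or Tableau API:

```python
def fetch_all(fetch_page, page_size=1000):
    """Drain a paginated endpoint by advancing an offset until a short page
    comes back.

    fetch_page(offset, limit) -> list of rows; any RESTlet that accepts
    offset/limit-style parameters (names are illustrative) can be wrapped
    this way.
    """
    rows, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        rows.extend(page)
        if len(page) < page_size:  # a short (or empty) page means we're done
            break
        offset += page_size
    return rows
```

In practice `fetch_page` would be an HTTP call to the RESTlet with token auth; the sketch just captures the loop a web data connector would need to perform.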

    Happy to connect and share more info if you’d like, but high-level:

Current state: we have a handful of Tableau NetSuite data sources set up as ODBC extracts. These can be time-consuming to set up, as Tableau’s GUI for creating a join in a NetSuite ODBC extract doesn’t work, leaving you to hand-code the (easy but boring) SQL to join List Names, etc. We used to have a full dump of NetSuite data into a warehouse (via Sesame Software’s “Relational Junction”), but have turned that off given SOX risks around managing access to NetSuite data.
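For anyone facing the same thing: the joins themselves really are simple once you resign yourself to writing them by hand. A rough illustration of the pattern, run here against a throwaway SQLite database with made-up stand-in tables (the real SuiteAnalytics Connect table and column names differ, so treat these as placeholders):

```python
import sqlite3

# Stand-in tables loosely shaped like a transaction-lines table plus a
# "list names" lookup -- names are illustrative, not the actual Connect schema.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE transaction_lines (transaction_id INTEGER, item_id INTEGER, amount REAL);
    CREATE TABLE item_list (item_id INTEGER PRIMARY KEY, item_name TEXT);
    INSERT INTO item_list VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO transaction_lines VALUES (100, 1, 25.0), (100, 2, 40.0), (101, 1, 10.0);
""")

# The kind of join Tableau's GUI refuses to build for a NetSuite ODBC
# extract, written by hand instead:
rows = con.execute("""
    SELECT tl.transaction_id, il.item_name, tl.amount
    FROM transaction_lines tl
    JOIN item_list il ON il.item_id = tl.item_id
    ORDER BY tl.transaction_id, il.item_name
""").fetchall()
```

Against the real connection you’d paste the same sort of SELECT into Tableau’s custom SQL box rather than run it from Python; the point is just how little SQL is involved.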

    Future state:

If we decide we can live with only operational data (i.e. no time series), I’m hopeful that RESTlets and Tableau’s Web Data Connector will work. If we test it out and find it can handle large data sets, then we’ll set up something self-service, where any user with the ability to make a saved search can publish that search as an endpoint that can then be consumed in Tableau.

If we decide we need something like a true data warehouse, then… I have a bigger project on my hands. We may build it ourselves, likely on either Redshift or Azure SQL, or we may consider a third-party vendor (e.g. Snowflake Computing), depending on what we decide we’re trying to do.

    Good luck with your project, and don’t hesitate to reach out. I’d love to hear about what you go with and how it goes.

    Cheers,

    Evan

    p.s. I sometimes wish we’d stuck with GoodData, which takes care of a lot of the data warehouse type work, but adoption never took off and we canceled.

  • #8887

    al3xicon

    trimblee – Thanks so much for the detailed response.

We ended up going with Sisense. I evaluated 11 different software options in total. Tableau was one of the finalists, but since we were using the ODBC connector and Tableau doesn’t really support that connector, I decided against it. The visualization side of Tableau is definitely the most powerful, but for our size, a full-stack solution made more sense.

Sisense integrates as *well as can be hoped* with the ODBC connector, is priced pretty reasonably (middle to lower end of the range of all options I evaluated), and stores all your data for you, essentially creating your own data warehouse (you can sort of run your own queries against the data, but more practically, if needed, you can create simple tabular data widgets to look at all your data). Their team was also one of only 2–3 who were willing to work with us, pre-sale, to get a working demo **using our actual data** that I could show to our CEO and key decision makers. Phocas and Qlik were also willing to do this, but for different reasons I didn’t end up getting that far with them. I guess I could have done it with Tableau as well, but it was a non-starter due to the problems with the data connections/ODBC.

The problem really is the speed of the SuiteAnalytics Connect / ODBC module in NetSuite, at least for us. You can’t query against it reliably, so you need to pull the data daily or so and store it locally. If I can ask Evan, how large is your dataset? We are also using lots of historical data in our transactions table to analyze trends, so I’m pulling in some 16 million rows from the TRANSACTION LINES table, which is the biggest. So, not terribly large. I’m guessing your company and dataset are somewhat bigger than ours since you were considering GoodData – in my evaluation, they were really expensive, FAR outside of our budget. Right now I’m downloading the data anew every night, because Sisense’s “data accumulation” feature doesn’t work with the ODBC connection, unfortunately. Even so, a full build/re-download of the data only takes about 1–1.5 hours.
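For anyone comparing notes on the nightly-pull approach: the shape of it is just a batched copy loop, so a 16-million-row table never has to sit in memory at once. A minimal sketch, assuming a generic DB-API source cursor (pyodbc’s cursor against the Connect DSN qualifies) and SQLite standing in as the local store; the SQL strings are whatever fits your tables:

```python
def copy_table(src_cursor, dest, select_sql, insert_sql, batch=50_000):
    """Stream rows from a source cursor into a local store in batches.

    src_cursor: any DB-API cursor (e.g. pyodbc against SuiteAnalytics Connect).
    dest:       a DB-API connection for the local store (e.g. sqlite3).
    Returns the number of rows copied.
    """
    src_cursor.execute(select_sql)
    total = 0
    while True:
        rows = src_cursor.fetchmany(batch)
        if not rows:
            break
        dest.executemany(insert_sql, rows)  # '?' placeholders work for both
        total += len(rows)
    dest.commit()
    return total
```

With pyodbc the source cursor would come from `pyodbc.connect(dsn).cursor()`; the function itself doesn’t care which side is which, which makes it easy to test against local databases first.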

    I’d be interested if you go the RESTlet route, how that works for you and especially how quickly you’re able to download the data.

    PS the schema from SuiteAnalytics is really annoying sometimes.

    Cheers,

    Alex


Kathy Woll replied on 01/02/2017, 10:01 AM: Hi Alex. Our company has connector middleware that was developed by a NS partner about 5 years ago. In August, it will no longer work due to security changes with NS. I am told we need to rewrite the connector. I am not sure I want a connector, but want to explore other options, such as direct access to the NS data. We publish online information about approx 100,000 customers – we expose contact info, trainings attended (items bought in NS), etc., so that the public can find a healthcare practitioner who practices the therapies we teach. http://www.iahp.com/pages/search/ We have to customize the data a bit after it’s pulled from NS – based on their status, membership level, etc. We have a web developer on staff (off site) who does that work for us (and who will help with the technical side). My question is – in addition to our current partner suggesting a solution for us, do you have any suggestions on who might offer an alternative idea? This is a big change and investment for us, and we think we need to expend some due diligence to ensure we have more than one option and are making the right decision. I am thinking we can’t be the first NS user to have to expose this level of data, at high speed, for live searching. The connector has slowed to a crawl since our last update in September and often takes more than a day to update a few thousand records. Thanks!

  • #8888

    stephen.gray

Excel (via MS Query) is a great tool for running queries, filtering your data, and then reporting in Excel. I’ve used it to run P&Ls, balance sheets, transaction lists, etc. You can set up the parameters in your Excel sheet, e.g. the date range, and then refresh the query each time you update these.

Works great and saves you pulling out full tables at once. You can also use the GROUP BY clause to roll up the data for even bigger data sets.
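To make the roll-up point concrete: a GROUP BY pushes the aggregation into the query, so far fewer rows ever cross the (slow) connection. A small self-contained illustration with made-up stand-in data, using SQLite here just so it runs anywhere — the same SELECT works through MS Query against the ODBC source:

```python
import sqlite3

# Toy stand-in for a transaction-lines table (names/values are made up).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE txn_lines (posting_month TEXT, account TEXT, amount REAL)")
con.executemany(
    "INSERT INTO txn_lines VALUES (?, ?, ?)",
    [("2016-01", "Sales", 100.0), ("2016-01", "Sales", 50.0),
     ("2016-01", "COGS", -40.0), ("2016-02", "Sales", 75.0)],
)

# Instead of pulling every line, pull one row per month/account:
summary = con.execute("""
    SELECT posting_month, account, SUM(amount)
    FROM txn_lines
    GROUP BY posting_month, account
    ORDER BY posting_month, account
""").fetchall()
```

Four detail lines come back as three summary rows here; on a real transaction table the reduction is orders of magnitude, which is what keeps Excel usable.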

An added advantage is that most people already have Excel, so it doesn’t cost any more, and it’s still the most flexible BI tool out there!

  • #8889

    al3xicon

    stephen.gray with respect, I’ve tried using Excel with our ODBC connection and it simply doesn’t have the power we need. I imagine it would be OK for very small-sized businesses. FWIW, when I pull data out of the ODBC connector via Sisense I can specify my query as well, meaning I don’t have to pull full tables if I do not want to. Cheers.

  • #8890

    stephen.gray

    Hi @al3xicon

It’s not really anything to do with the size of the business. It’s the amount of data you need to import that’s important. A small manufacturing firm could have many more transactions in its database than a large professional services company.

I’ve used it with small and large firms and just filtered and rolled up the data to what I need for the specific report.

If you need to pull large volumes of data for analysis then you’re right, Excel isn’t the right tool, but you’ve got to question whether you need to pull out all that data in the first place and whether you’re actually using it in your BI reports.

  • #8891

    al3xicon

    Thanks stephen.gray –

    I didn’t mean to imply that “size” meant revenue necessarily, or number of users. Of course the amount of data is key. “Size of business” in this instance does translate to “size of data”.

    And if Excel works for you, that’s great. We are pretty happy with our choice. Considering what my CEO and CFO tell me they want to see, I am pulling only the data we need, but it’s a lot. NetSuite’s schema for SuiteAnalytics Connect unfortunately means we have to pull more than we need to on some tables. But still, a full download of every scrap of data we need only takes about 1.25 – 1.5 hours, and we have it running in a cluster so that it’s replicated, meaning we can have near-realtime data.

    I appreciate your input.

  • #8892

    trimblee

    On the topic of Excel: we used a tool called SmartView from a company called Solution 7 for about a year. I wouldn’t think of it as a BI alternative, but it was great as a way to manipulate NetSuite data within Excel. They’ve built a thoughtfully designed Excel plugin that allows you to query, filter, etc. The use case for us was external financial reporting, for which it worked great in the proof of concepts we did. Deal breaker for us ended up being that it required local software installation and a little too much handholding by the technology team.

    al3xicon Very interesting to hear about Sisense, thank you. Our BI/data warehouse project is on the back burner for now in preference to other projects, but I’ll update this thread when we get to it.

Regarding the size of our data sets, on the corporate side it’s really not that big. We tried GoodData in part because our development team already uses it. That said, it’s still large enough that querying live via ODBC doesn’t work. The saving grace of Tableau’s data extract feature in this regard is that you can schedule nightly extracts, so users are reporting against data in our Tableau Server instance instead of querying NetSuite via ODBC. And yeah, the SuiteAnalytics schema is totally annoying, especially if you need to support financial analysts who are querying against NetSuite directly from Excel. That said, NetSuite recently added it to the Schema Browser, which is great.

  • #8893

    al3xicon

    Thanks Evan – If you’d like to bounce any thoughts or questions re: Sisense off of me when you get to that point, feel free. Look forward to hearing your thoughts.

  • #8894

    savethepennies

    I’m happy to see a good discussion around this topic. As Alex mentioned, I have seen other threads but rarely much conversation. Surely there are others working through similar challenges of data availability and enhanced visualization. It would be nice to share ideas and solutions.

    We have been using the ODBC in a few ways for several years. Most of our BI efforts have been based around SSRS. More recently we have been playing around with the ODBC in Excel and Power Pivot as a way to further enable power and more tech/data savvy users. That seems promising for certain circumstances.

Some of our reporting needs to be live, so the ODBC is used directly. We have found plenty of limitations on performance and how queries can be written, so it’s not the preferred option. I can share more details if others’ experiences are different. Several years ago I wrote a C# utility that uses the ODBC to cache data from specified tables nightly. Much of the time we’re okay with slightly stale data, so we can write our queries against that data in a local SQL database.
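For what it’s worth, the nightly-cache pattern can also be made incremental when a table has a reliable last-modified column, so only changed rows get re-pulled each night. A rough sketch of the idea in Python rather than C# (the `last_modified_date` column name and table shape are my assumptions — check your actual schema), exercised here with SQLite stand-ins on both sides:

```python
import sqlite3

def incremental_sync(src, dest, watermark):
    """Pull only rows modified since the last run and upsert them locally.

    watermark is the high-water mark persisted from the previous run
    (empty string on the first run). Returns the new high-water mark.
    Assumes dest's txn_lines table has a PRIMARY KEY on id so that
    INSERT OR REPLACE acts as an upsert.
    """
    rows = src.execute(
        "SELECT id, amount, last_modified_date FROM txn_lines"
        " WHERE last_modified_date > ? ORDER BY last_modified_date",
        (watermark,),
    ).fetchall()
    dest.executemany("INSERT OR REPLACE INTO txn_lines VALUES (?, ?, ?)", rows)
    dest.commit()
    return rows[-1][2] if rows else watermark
```

The caveat is that deletes never show up in a modified-since query, so most setups still do a periodic full refresh to true things up.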

    As our data has grown we’ve started looking for a more robust solution for our ETL type process. Currently we’re trying to implement a solution with SSIS. We’re running into some authentication issues so if anyone has gone down that road I’d be interested in hearing about your setup. I’m not sure how SSIS compares with some of the other providers mentioned here.

That’s a slightly different perspective, so hopefully it adds to the conversation.

  • #8895

    al3xicon

    Thanks for your input savethepennies – I don’t know much about SSRS or SSIS, but it sounds like you have some specific needs. How much data are you working with out of curiosity?

    I’m glad to have more input on this, thanks!

  • #8896

    savethepennies

    Using transaction_lines as the measure, looks like we’re at about 1.6 million rows. As you mentioned, that seems to be the largest table.

  • #8897

    al3xicon

Ah, that is a fairly small data set, then. ODBC for real-time reporting is probably a fairly viable option for you, though as you mention it’s not great speed-wise. If they could speed up the ODBC connection, it would make it a *much* more attractive candidate. Of course, I bet they’re going to roll out their advanced analytics platform soon anyway.

  • #8898

    savethepennies

    I really hope the advanced analytics comes soon. Based on what I’d heard at SuiteWorld 2015 we had hoped for 2016.1. Doesn’t look like that is the case. Improvements to saved searches/reporting in-product would have a big impact for users. I’m skeptical that it will do enough to completely replace the need for some external BI software but hopefully moves the needle on self service for less technical users.

  • #8899

    al3xicon

    Yeah I feel the same savethepennies – I’m signed up to attend the “future look” session re: SuiteAnalytics at SW2016 so hopefully I’ll learn what they’re planning there.
