
API and bulk data uploads



  • Jason Harris

    Thanks for the question Swayne!

    We don’t offer this right now but we’ve noted it in our ‘ideas’ list for future development.

  • mmorganfield

I think this is getting at the same goal as my question, so I’ll put this here. How do you recommend going about backfilling data? Redshift recommends using the ‘COPY’ command; however, that requires quite a few steps and an external host, and I’m not sure how to go about that technically.

    What would be your recommended approach for adding, for instance, 4 months worth of FB Page data into an existing FB Page table?


  • mmorganfield

Would it just be an ‘INSERT INTO … SELECT’?

  • Alon Brody

    That’s a good question Max. I would say that the simplest way would be:

    1. Export the data from FB into a CSV.
    2. Use our File Upload, S3, or Google Drive data sources to load the file to your account. A few important notes:
    • Set the destination to a different destination from where your current FB Pages data sits. The main reason is that the CSV metadata (headers/types/etc.) is probably not the same as the table’s metadata.
    • Set a primary key at the data source level so you will not have duplicates in your table.
    3. Once the data is loaded into your “temp” table, use an INSERT INTO query to insert data from the “temp” table into the main FB Pages table. You might want to use a filter that makes sure you’re not ingesting data that already exists in the main table. Note that you should combine the relevant key fields in order to prevent duplicate data.
    4. Once happy, drop the “temp” table, as all the relevant data will now be stored in the main FB Pages table.
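    A minimal sketch of steps 3 and 4 above. The table and column names here are hypothetical, and SQLite stands in for Redshift so the example is self-contained; the same `INSERT INTO … SELECT` with a `NOT EXISTS` filter works in Redshift SQL.

    ```python
    import sqlite3

    # Hypothetical schemas standing in for the main FB Pages table and the
    # "temp" table that the CSV backfill was loaded into.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE fb_pages (page_id INTEGER, report_date TEXT, likes INTEGER,
                               PRIMARY KEY (page_id, report_date));
        CREATE TABLE fb_pages_temp (page_id INTEGER, report_date TEXT, likes INTEGER);
    """)

    # One row already in the main table, plus a backfill batch that overlaps it.
    conn.execute("INSERT INTO fb_pages VALUES (1, '2019-01-01', 100)")
    conn.executemany("INSERT INTO fb_pages_temp VALUES (?, ?, ?)", [
        (1, '2019-01-01', 100),   # duplicate of an existing row
        (1, '2018-12-31', 95),    # new historical row to backfill
    ])

    # Step 3: insert only rows that do not already exist in the main table,
    # filtering on the combined key fields (page_id + report_date).
    conn.execute("""
        INSERT INTO fb_pages
        SELECT t.page_id, t.report_date, t.likes
        FROM fb_pages_temp t
        WHERE NOT EXISTS (
            SELECT 1 FROM fb_pages m
            WHERE m.page_id = t.page_id AND m.report_date = t.report_date
        )
    """)

    # Step 4: drop the temp table once the backfill looks right.
    conn.execute("DROP TABLE fb_pages_temp")

    rows = conn.execute("SELECT * FROM fb_pages ORDER BY report_date").fetchall()
    print(rows)  # the duplicate row was skipped; only the new row was added
    ```

    The `NOT EXISTS` filter is what prevents duplicates when the backfill window overlaps data already in the main table.
    
    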
