auto_paginate_query {shroomDK}    R Documentation

Auto Paginate Queries

Description

Intelligently grab up to 1 GB of data from a SQL query, with automatic pagination and cleaning of the results.

Usage

auto_paginate_query(
  query,
  api_key,
  page_size = 25000,
  page_count = NULL,
  data_source = "snowflake-default",
  data_provider = "flipside",
  api_url = "https://api-v2.flipsidecrypto.xyz/json-rpc"
)

Arguments

query

The SQL query to pass to ShroomDK.

api_key

ShroomDK API key.

page_size

Default 25,000 rows. May return an error if 'page_size' is too large (a single page exceeds 30 MB or the entire query result exceeds 1 GB). Ignored if the results fit on one page of less than 15 MB of data.

page_count

How many pages, of 'page_size' rows each, to read. The default NULL calculates this as the ceiling of (number of rows in results / page_size); see Details. Ignored if the results fit on one page of less than 15 MB of data.

data_source

Where the data is sourced from, including the specific computation warehouse. Default '"snowflake-default"'. Non-default data sources may require the api_key to be registered on an allowlist.

data_provider

Who provides the data. Default '"flipside"'. Non-default data providers may require the api_key to be registered on an allowlist.

api_url

Defaults to https://api-v2.flipsidecrypto.xyz/json-rpc, but can be changed if the user has access to an updated endpoint.
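
Details

When 'page_count' is NULL, the number of pages requested is the ceiling of (number of rows in the results / page_size), as described under Arguments. The snippet below is only an illustration of that arithmetic with a hypothetical result size; the function computes the real row count internally.

n_rows_in_results <- 10001  # hypothetical result size, for illustration only
page_size <- 9000
page_count <- ceiling(n_rows_in_results / page_size)
page_count  # 2 pages of up to 9,000 rows each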

Value

A data frame of up to 'page_size * page_count' rows; see ?clean_query for more details on column classes.

Examples

## Not run: 
pull_data <- auto_paginate_query("
SELECT * FROM ETHEREUM.CORE.FACT_TRANSACTIONS LIMIT 10001",
api_key = readLines("api_key.txt"),
page_size = 9000, # ends up ignored because results fit on 1 page.
page_count = NULL)
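
# A sketch of explicitly controlling pagination. The row limit, page_size,
# and page_count below are illustrative values, not package defaults:
big_pull <- auto_paginate_query(
  query = "SELECT * FROM ETHEREUM.CORE.FACT_TRANSACTIONS LIMIT 100000",
  api_key = readLines("api_key.txt"),
  page_size = 25000, # 100,000 rows split into 4 pages of 25,000 rows each
  page_count = 4     # read 4 pages explicitly instead of letting NULL compute it
)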

## End(Not run)

[Package shroomDK version 0.3.0 Index]