from_sdf {sparklyr.flint} | R Documentation
Construct a TimeSeriesRDD from a Spark DataFrame
Description
Construct a TimeSeriesRDD containing time series data from a Spark DataFrame.
Usage
from_sdf(
  sdf,
  is_sorted = FALSE,
  time_unit = .sparklyr.flint.globals$kValidTimeUnits,
  time_column = .sparklyr.flint.globals$kDefaultTimeColumn
)

fromSDF(
  sdf,
  is_sorted = FALSE,
  time_unit = .sparklyr.flint.globals$kValidTimeUnits,
  time_column = .sparklyr.flint.globals$kDefaultTimeColumn
)
Arguments
sdf
  A Spark DataFrame object.

is_sorted
  Whether the rows being imported are already sorted by time.

time_unit
  Time unit of the time column (must be one of the following values: "NANOSECONDS", "MICROSECONDS", "MILLISECONDS", "SECONDS", "MINUTES", "HOURS", "DAYS").

time_column
  Name of the time column.
Value
A TimeSeriesRDD usable by the Flint time series library.
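The returned handle works with the other ts_rdd utility functions listed under See Also; for example, it can be converted back into a Spark DataFrame with to_sdf() or collected into an R data frame with collect(). A minimal sketch, assuming `ts` is a TimeSeriesRDD produced by from_sdf() as in the Examples section:

## convert the TimeSeriesRDD back into a Spark DataFrame
sdf2 <- to_sdf(ts)

## or collect its contents into an R data frame on the driver
df <- collect(ts)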
See Also
Other Spark dataframe utility functions: collect.ts_rdd(), from_rdd(), spark_connection.ts_rdd(), spark_dataframe.ts_rdd(), spark_jobj.ts_rdd(), to_sdf(), ts_rdd_builder()
Examples
library(sparklyr)
library(sparklyr.flint)

## attempt to connect to a local Spark cluster
sc <- try_spark_connect(master = "local")

if (!is.null(sc)) {
  ## copy an in-memory tibble into Spark, then build a TimeSeriesRDD
  ## whose time column `t` is interpreted in seconds
  sdf <- copy_to(sc, tibble::tibble(t = seq(10), v = seq(10)))
  ts <- from_sdf(sdf, is_sorted = TRUE, time_unit = "SECONDS", time_column = "t")
} else {
  message("Unable to establish a Spark connection!")
}
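A possible follow-up sketch (not part of the shipped example; it assumes the `sc` and `ts` objects from above and the summarize_avg() summarizer exported by sparklyr.flint, with the `column` argument naming the value column):

if (!is.null(sc)) {
  ## average the `v` column across the whole time series
  avg_ts <- summarize_avg(ts, column = "v")
  ## convert the summarized result back to a Spark DataFrame for inspection
  print(to_sdf(avg_ts))
}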
[Package sparklyr.flint version 0.2.2]