tokenize_folder {fastai}	R Documentation

Tokenize_folder

Description

Tokenize text files in 'path' in parallel using 'n_workers'.

Usage

tokenize_folder(
  path,
  extensions = NULL,
  folders = NULL,
  output_dir = NULL,
  skip_if_exists = TRUE,
  output_names = NULL,
  n_workers = 6,
  rules = NULL,
  tok = NULL,
  encoding = "utf8"
)

Arguments

path

Path to the folder containing the text files to tokenize.

extensions

File extensions of the files to tokenize (e.g. ".txt").

folders

Subfolders of 'path' to restrict tokenization to; if NULL, all subfolders are searched.

output_dir

Directory in which the tokenized files are written.

skip_if_exists

Whether to skip tokenization if the output directory already exists.

output_names

Names to give the tokenized output files.

n_workers

Number of parallel workers to use.

rules

Preprocessing rules applied to the text before tokenization; if NULL, the default text processing rules are used.

tok

Tokenizer to use; if NULL, the default fastai tokenizer is used.

encoding

Encoding of the text files (default "utf8").

Value

None. The function is called for the side effect of writing the tokenized files to disk.
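
Examples

A minimal sketch of a typical call, assuming the fastai Python backend is configured and that "data/texts" is a hypothetical folder of plain-text (.txt) files:

## Not run:
library(fastai)

# Tokenize every text file under "data/texts" with 6 parallel workers.
# Tokenized copies are written to the hypothetical "data/texts_tok" folder.
tokenize_folder(
  path = "data/texts",
  output_dir = "data/texts_tok",
  n_workers = 6
)
## End(Not run)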


[Package fastai version 2.2.2 Index]