tokenize_files {fastai} | R Documentation |
Tokenize files
Description
Tokenize the text 'files' in parallel using 'n_workers' workers.
Usage
tokenize_files(
files,
path,
output_dir,
output_names = NULL,
n_workers = 6,
rules = NULL,
tok = NULL,
encoding = "utf8",
skip_if_exists = FALSE
)
Arguments

files | character vector of paths to the text files to tokenize
path | path to the folder containing the files
output_dir | directory in which the tokenized files are written
output_names | optional names for the output files; if NULL, the input file names are reused
n_workers | number of parallel workers (default 6)
rules | custom preprocessing rules applied before tokenization; if NULL, the default rules are used
tok | tokenizer to use; if NULL, the default tokenizer is used
encoding | text encoding of the input files (default "utf8")
skip_if_exists | if TRUE, skip files whose tokenized output already exists
Value
None
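Examples

A minimal usage sketch, assuming the fastai R package and its Python backend are installed; the directory names and file pattern below are illustrative:

```r
library(fastai)

# Tokenize every .txt file under "texts/" into "texts_tok/",
# using 4 parallel workers and skipping already-tokenized files.
tokenize_files(
  files = list.files("texts", pattern = "\\.txt$", full.names = TRUE),
  path = "texts",
  output_dir = "texts_tok",
  n_workers = 4,
  skip_if_exists = TRUE
)
```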
[Package fastai version 2.2.2 Index]