getWikiFiles {wikiTools} | R Documentation |
Downloads a list of Wikipedia pages to a specified path on the computer, and returns a vector of the not-found names (if any).
Description
Downloads a list of Wikipedia pages to a specified path on the computer, and returns a vector of the not-found names (if any).
Usage
getWikiFiles(X, language = c("es", "en", "fr"), directory = "./", maxtime = 0)
Arguments
X |
A vector of Wikipedia entries. |
language |
The language of the Wikipedia page version. This should consist of an ISO language code (default = "en"). |
directory |
Directory to which the files are exported. |
maxtime |
Maximum random waiting time to apply between consecutive searches, in case you want to throttle requests (default = 0: no waiting). |
Details
This function downloads a set of Wikipedia pages into a directory on the local computer. All errors (pages not found) are reported in the return value (NULL = no errors). The files are downloaded into the chosen directory.
Value
It returns a vector of the names that produced errors, if any (NULL = no errors). All files are downloaded into the selected directory.
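A minimal sketch of how the return value can be checked after a download; the entry name "Nonexistent_Entry_xyz" is a hypothetical placeholder used only to illustrate a not-found result:

```r
## Not run:
# Download one valid page and one hypothetical missing one into a
# temporary folder; the return value collects the not-found names.
errs <- getWikiFiles(c("Rembrandt", "Nonexistent_Entry_xyz"),
                     language = "en", directory = tempdir())
if (is.null(errs)) {
  message("All pages downloaded.")
} else {
  print(errs)  # names that could not be found
}
## End(Not run)
```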
Author(s)
Modesto Escobar, Department of Sociology and Communication, University of Salamanca. See https://sociocav.usal.es/blog/modesto-escobar/
Examples
## Not run:
## In case you want to download the Wikipedia page of a person:
# getWikiFiles("Rembrandt", directory = "./")
## Or the pages of multiple painters:
# B <- c("Monet", "Renoir", "Caillebotte")
# getWikiFiles(B, directory = "./", language = "fr")
## End(Not run)