download.file {utils} | R Documentation
Download File from the Internet
Description
This function can be used to download a file from the Internet.
Usage
download.file(url, destfile, method, quiet = FALSE, mode = "w",
              cacheOK = TRUE,
              extra = getOption("download.file.extra"),
              headers = NULL, ...)
Arguments
url: a character string (or longer vector for the "libcurl" method) naming the URL of a resource to be downloaded.
destfile: a character string (or vector, see the url argument) with the file path where the downloaded file is to be saved. Tilde-expansion is performed.
method: method to be used for downloading files. Current download methods are "internal", "libcurl", "wget", "curl" and "wininet" (Windows only), and there is a value "auto": see ‘Details’ and ‘Note’. The method can also be set through the option "download.file.method": see options().
quiet: if TRUE, suppress status messages (if any), and the progress bar.
mode: character. The mode with which to write the file. Useful values are "w", "wb" (binary), "a" (append) and "ab". Not used for methods "wget" and "curl". See also ‘Details’.
cacheOK: logical. Is a server-side cached value acceptable?
extra: character vector of additional command-line arguments for the "wget" and "curl" methods.
headers: named character vector of additional HTTP headers to use in HTTP[S] requests. It is ignored for non-HTTP[S] URLs. The User-Agent header, taken from the HTTPUserAgent option (see options), is used as the first header, automatically.
...: allow additional arguments to be passed, unused.
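A minimal sketch of a typical call, tying the arguments together. The URL and the helper name fetch_pdf are hypothetical; headers supplies extra HTTP headers as a named character vector, and mode = "wb" forces a binary transfer (important on Windows).

```r
## fetch_pdf is a hypothetical wrapper around download.file():
## 'headers' adds an Accept header to the HTTP request, and
## mode = "wb" requests a binary transfer.
fetch_pdf <- function(url, destfile) {
  download.file(url, destfile,
                mode = "wb",
                headers = c(Accept = "application/pdf"))
}
## Usage: fetch_pdf("https://example.org/report.pdf", "report.pdf")
```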
Details
The function download.file can be used to download a single file as described by url from the internet and store it in destfile.
The url must start with a scheme such as ‘http://’, ‘https://’ or ‘file://’. Which methods support which schemes varies by R version, but method = "auto" will try to find a method which supports the scheme.
For method = "auto" (the default), currently the "internal" method is used for ‘file://’ URLs and "libcurl" for all others.
Support for method "libcurl" was optional on Windows prior to R 4.2.0: use capabilities("libcurl") to see if it is supported on an earlier version. It uses an external library of that name (https://curl.se/libcurl/) against which R can be compiled.
When method "libcurl" is used, there is support for simultaneous downloads, so url and destfile can be character vectors of the same length greater than one (but the method has to be specified explicitly and not via "auto"). For a single URL and quiet = FALSE a progress bar is shown in interactive use.
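The simultaneous-download form can be sketched as follows; the URLs are hypothetical, and note that method = "libcurl" must be named explicitly rather than reached via "auto".

```r
## Equal-length vectors of URLs and destination paths; with
## method = "libcurl" the files are fetched simultaneously.
## (Example URLs are hypothetical.)
urls  <- c("https://example.org/a.csv", "https://example.org/b.csv")
dests <- file.path(tempdir(), basename(urls))
## download.file(urls, dests, method = "libcurl")
```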
Nowadays the "internal" method only supports the ‘file://’ scheme (for which it is the default). On Windows the "wininet" method currently supports the ‘file://’ scheme and (deprecated, with a warning) the ‘http://’ and ‘https://’ schemes.
For methods "wget" and "curl" a system call is made to the tool given by method, and the respective program must be installed on your system and be in the search path for executables. They will block all other activity on the R process until they complete: this may make a GUI unresponsive.
cacheOK = FALSE is useful for ‘http://’ and ‘https://’ URLs: it will attempt to get a copy directly from the site rather than from an intermediate cache. It is used by available.packages.
The "libcurl" and "wget" methods follow ‘http://’ and ‘https://’ redirections to any scheme they support. (For method "curl" use argument extra = "-L". To disable redirection in wget, use extra = "--max-redirect=0".) The "wininet" method supports some redirections but not all. (For method "libcurl", messages will quote the endpoint of redirections.)
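As a sketch, a small wrapper (curl_follow is a hypothetical name) can make the "curl" method follow redirections the way "libcurl" and "wget" do by default:

```r
## curl_follow is a hypothetical wrapper: the "curl" method only
## follows HTTP redirections when curl's -L flag is passed via 'extra'.
curl_follow <- function(url, destfile)
  download.file(url, destfile, method = "curl",
                extra = "-L", mode = "wb")
## Conversely, to stop "wget" from redirecting:
## download.file(url, destfile, method = "wget",
##               extra = "--max-redirect=0")
```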
See url for how ‘file://’ URLs are interpreted, especially on Windows. The "internal" and "wininet" methods do not percent-decode, but the "libcurl" and "curl" methods do: method "wget" does not support ‘file://’ URLs. Most methods do not percent-encode special characters such as spaces in URLs (see URLencode), but it seems the "wininet" method does.
The remaining details apply to the "wininet" and "libcurl" methods only.
The timeout for many parts of the transfer can be set by the option timeout, which defaults to 60 seconds. This is often insufficient for downloads of large files (50MB or more) and so should be increased when download.file is used in packages to download such files. Note that in recent versions of R the user can set the default timeout by the environment variable R_DEFAULT_INTERNET_TIMEOUT, so to ensure that this is not decreased, packages should use something like
options(timeout = max(300, getOption("timeout")))
(It is unrealistic to require download times of less than 1s/MB.)
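One way to apply this advice in package code is to raise the timeout only for the duration of the download and restore the user's setting afterwards; with_long_timeout below is a hypothetical helper, not part of R.

```r
## Temporarily raise the 'timeout' option, never lowering a larger
## user-set value, and restore the previous option on exit.
with_long_timeout <- function(expr, min_timeout = 300) {
  old <- options(timeout = max(min_timeout, getOption("timeout")))
  on.exit(options(old))
  expr  # lazily evaluated here, i.e. with the raised timeout in force
}
## Usage: with_long_timeout(download.file(url, destfile, mode = "wb"))
```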
The level of detail provided during transfer can be set by the quiet argument and the internet.info option: the details depend on the platform and scheme. For the "libcurl" method, values of the option less than 2 give verbose output.
A progress bar tracks the transfer platform-specifically:
- On Windows
If the file length is known, the full width of the bar is the known length. Otherwise the initial width represents 100 Kbytes and is doubled whenever the current width is exceeded. (In non-interactive use this uses a text version. If the file length is known, an equals sign represents 2% of the transfer completed: otherwise a dot represents 10Kb.)
- On a Unix-alike
If the file length is known, an equals sign represents 2% of the transfer completed: otherwise a dot represents 10Kb.
The choice of binary transfer (mode = "wb" or "ab") is important on Windows, since unlike Unix-alikes it does distinguish between text and binary files and for text transfers changes ‘\n’ line endings to ‘\r\n’ (aka ‘CRLF’).
On Windows, if mode is not supplied (missing()) and url ends in one of ‘.gz’, ‘.bz2’, ‘.xz’, ‘.tgz’, ‘.zip’, ‘.jar’, ‘.rda’, ‘.rds’, ‘.RData’ or ‘.pdf’, mode = "wb" is set so that a binary transfer is done to help unwary users.
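A defensive sketch of this advice: download_binary is a hypothetical wrapper that always requests a binary transfer, so code using it behaves the same on Windows and Unix-alikes regardless of the file extension.

```r
## download_binary is a hypothetical wrapper that forces mode = "wb",
## avoiding CRLF corruption of binary files on Windows.
download_binary <- function(url, destfile, ...)
  download.file(url, destfile, mode = "wb", ...)
```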
Code written to download binary files must use mode = "wb" (or "ab"), but the problems incurred by a text transfer will only be seen on Windows.
Value
An (invisible) integer code, 0 for success and non-zero for failure. For the "wget" and "curl" methods this is the status code returned by the external program. The "internal" method can return 1, but will in most cases throw an error.
What happens to the destination file(s) in the case of error depends on the method and R version. Currently the "internal", "wininet" and "libcurl" methods will remove the file if the URL is unavailable, except when mode specifies appending, when the file should be unchanged.
Setting Proxies
For the Windows-only method "wininet", the ‘Internet Options’ of the system are used to choose proxies and so on; these are set in the Control Panel and are those used for system browsers.
For the "libcurl" and "curl" methods, proxies can be set via the environment variables http_proxy or ftp_proxy. See https://curl.se/libcurl/c/libcurl-tutorial.html for further details.
Secure URLs
Methods which access ‘https://’ and (where supported) ‘ftps://’ URLs should try to verify the site certificates. This is usually done using the CA root certificates installed by the OS (although we have seen instances in which these got removed rather than updated). For further information see https://curl.se/docs/sslcerts.html.
On Windows with method = "libcurl", the CA root certificates are provided by the OS when R was linked with libcurl with Schannel enabled, which is the current default in Rtools. This can be verified by checking that libcurlVersion() returns a version string containing ‘"Schannel"’. If it does not, for verification to be on, the environment variable CURL_CA_BUNDLE must be set to a path to a certificate bundle file, usually named ‘ca-bundle.crt’ or ‘curl-ca-bundle.crt’. (This is normally done automatically for a binary installation of R, which installs ‘R_HOME/etc/curl-ca-bundle.crt’ and sets CURL_CA_BUNDLE to point to it if that environment variable is not already set.) For an updated certificate bundle, see https://curl.se/docs/sslcerts.html. Currently one can download a copy from https://raw.githubusercontent.com/bagder/ca-bundle/master/ca-bundle.crt and set CURL_CA_BUNDLE to the full path to the downloaded file.
On Windows with method = "libcurl", when R was linked with libcurl with Schannel enabled, the connection fails if it cannot be established that the certificate has not been revoked. Some MITM proxies, present particularly in corporate environments, do not work with this behavior. It can be changed by setting the environment variable R_LIBCURL_SSL_REVOKE_BEST_EFFORT to TRUE, with the consequence of reducing security.
Note that the root certificates used by R may or may not be the same as used in a browser, and indeed different browsers may use different certificate bundles (there is typically a build option to choose either their own or the system ones).
Good practice
Setting the method should be left to the end user. Neither the wget nor the curl command is widely available: you can check if one is available via Sys.which, and should do so in a package or script.
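A sketch of that check: Sys.which() returns "" when a program is not on the search path, so nzchar() on its result tells you whether an external downloader can be used.

```r
## Prefer an external tool only when it is actually installed,
## falling back to method = "auto" otherwise.
method <- if (nzchar(Sys.which("curl"))) {
  "curl"
} else if (nzchar(Sys.which("wget"))) {
  "wget"
} else {
  "auto"
}
```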
If you use download.file in a package or script, you must check the return value, since it is possible that the download will fail with a non-zero status but not an R error.
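That check can be wrapped up as follows; safe_download is a hypothetical helper that turns both a non-zero status code and an R error into a single, explicit failure.

```r
## safe_download normalises the two failure modes of download.file():
## an R error is caught and mapped to a non-zero status, and any
## non-zero status is then raised as an error.
safe_download <- function(url, destfile, ...) {
  status <- tryCatch(download.file(url, destfile, ...),
                     error = function(e) 1L)
  if (!identical(as.integer(status), 0L))
    stop("download of ", url, " failed")
  invisible(destfile)
}
```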
The supported methods do change: method "libcurl" was introduced in R 3.2.0 and was optional on Windows until R 4.2.0. Use capabilities("libcurl") in a program to see if it is available.
‘ftp://’ URLs
Most modern browsers do not support such URLs, and ‘https://’ ones are much preferred for use in R. ‘ftps://’ URLs have always been rare, and are nowadays even less supported.
It is intended that R will continue to allow such URLs for as long as libcurl does, but as they become rarer this is increasingly untested. What ‘protocols’ the version of libcurl being used supports can be seen by calling libcurlVersion().
These URLs are accessed using the FTP protocol, which has a number of variants. One distinction is between ‘active’ and ‘(extended) passive’ modes: which is used is chosen by the client. The "libcurl" method uses passive mode, which was almost universally used by browsers before they dropped support altogether.
Note
Files of more than 2GB are supported on 64-bit builds of R; they may be truncated on some 32-bit builds.
Methods "wget" and "curl" are mainly for historical compatibility but may provide capabilities not supported by the "libcurl" or "wininet" methods.
Method "wget" can be used with proxy firewalls which require user/password authentication, if proper values are stored in the configuration file for wget.
wget (https://www.gnu.org/software/wget/) is commonly installed on Unix-alikes (but not macOS). Windows binaries are available from MSYS2 and elsewhere.
curl (https://curl.se/) is installed on macOS and increasingly commonly on Unix-alikes. Windows binaries are available at that URL.
See Also
options to set the HTTPUserAgent, timeout and internet.info options used by some of the methods.
url for a finer-grained way to read data from URLs.
url.show, available.packages, download.packages for applications.
Contributed packages RCurl and curl provide more comprehensive facilities to download from URLs.