Commit a48ec60c authored by Glenn Morris's avatar Glenn Morris

Fix previous URL doc change

* lisp/url/url-queue.el (url-queue-retrieve): Fix previous doc fix.

* doc/misc/url.texi (Retrieving URLs): Update url-retrieve arguments.
Mention url-queue-retrieve.

* etc/NEWS: Related edit.
parent 578ad769
@@ -216,10 +216,10 @@ non-@code{nil}, do not store or send cookies.
 @vindex url-queue-parallel-processes
 @vindex url-queue-timeout
 @defun url-queue-retrieve url callback &optional cbargs silent no-cookies
-This acts like the @code{url-retrieve} function, but downloads in
-parallel.  The option @code{url-queue-parallel-processes} controls the
-number of concurrent processes, and the option @code{url-queue-timeout}
-sets a timeout in seconds.
+This acts like the @code{url-retrieve} function, but with limits on
+the degree of parallelism.  The option @code{url-queue-parallel-processes}
+controls the number of concurrent processes, and the option
+@code{url-queue-timeout} sets a timeout in seconds.
 @end defun
 @node Supported URL Types
......
@@ -858,8 +858,9 @@ default value to "".
 remote machines that support SELinux.
 +++
-** New function, url-queue-retrieve, fetches URLs asynchronously like
-url-retrieve does, but in parallel.
+** New function, `url-queue-retrieve', which behaves like url-retrieve,
+but with limits (`url-queue-parallel-processes', `url-queue-timeout') on
+the degree of parallelism.
 ** VC and related modes
......
2012-02-10 Glenn Morris <rgm@gnu.org>
* url-queue.el (url-queue-retrieve): Fix previous doc fix.
2012-02-10 Andreas Schwab <schwab@linux-m68k.org>
* url-http.el (url-http-clean-headers): Return the number of
......
@@ -57,9 +57,9 @@
 (defun url-queue-retrieve (url callback &optional cbargs silent inhibit-cookies)
   "Retrieve URL asynchronously and call CALLBACK with CBARGS when finished.
 This is like `url-retrieve' (which see for details of the arguments),
-but downloads in parallel.  The variable `url-queue-parallel-processes'
-sets the number of concurrent processes.  The variable `url-queue-timeout'
-sets a timeout."
+but with limits on the degree of parallelism.  The variable
+`url-queue-parallel-processes' sets the number of concurrent processes.
+The variable `url-queue-timeout' sets a timeout."
   (setq url-queue
 	(append url-queue
 		(list (make-url-queue :url url
......
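For readers browsing this change, a minimal sketch of how the queued fetcher is meant to be called. The URL and callback body below are illustrative only and not part of the commit; the argument order follows the `url-queue-retrieve' signature shown in the diff above.

```elisp
;; Illustrative use of url-queue-retrieve (url-queue.el, Emacs 24.1+).
;; The URL is a placeholder; the callback receives a STATUS plist
;; first, followed by any CBARGS, as with url-retrieve.
(require 'url-queue)

;; Optionally bound the degree of parallelism, as this commit documents.
(setq url-queue-parallel-processes 5   ; max concurrent fetches
      url-queue-timeout 10)            ; seconds before a fetch is abandoned

(url-queue-retrieve
 "https://www.gnu.org/"                ; placeholder URL
 (lambda (status &rest _cbargs)
   ;; The callback runs in a buffer holding the response; check
   ;; STATUS for an :error entry before trusting the body.
   (if (plist-get status :error)
       (message "Fetch failed: %S" (plist-get status :error))
     (message "Fetched %d bytes" (buffer-size))))
 nil    ; CBARGS: extra arguments passed to the callback
 t      ; SILENT: suppress progress messages
 t)     ; INHIBIT-COOKIES: do not store or send cookies
```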