Usage: %s [OPTION]... [URL]...

Copyright (C) 2005 Free Software Foundation, Inc.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.
Originally written by Hrvoje Niksic.

GNU Wget %s, a non-interactive network retriever.
Try `%s --help' for more options.

Can't be verbose and quiet at the same time.
Can't timestamp and not clobber old files at the same time.
Cannot specify both --inet4-only and --inet6-only.
DEBUG output created by Wget %s on %s.
Removing file due to --delete-after in main():
FINISHED --%s--
Downloaded: %s bytes in %d files
Download quota (%s bytes) EXCEEDED!

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version                 display the version of Wget and exit.
  -h,  --help                    print this help.
  -b,  --background              go to background after startup.
  -e,  --execute=COMMAND         execute a `.wgetrc'-style command.

Logging and input file:
  -o,  --output-file=FILE        log messages to FILE.
  -a,  --append-output=FILE      append messages to FILE.
  -d,  --debug                   print lots of debugging information.
  -q,  --quiet                   quiet (no output).
  -v,  --verbose                 be verbose (this is the default).
  -nv, --no-verbose              turn off verboseness, without being quiet.
  -i,  --input-file=FILE         download URLs found in FILE.
  -F,  --force-html              treat input file as HTML.
  -B,  --base=URL                prepends URL to relative links in -F -i file.

Download:
  -t,  --tries=NUMBER            set number of retries to NUMBER (0 unlimits).
       --retry-connrefused       retry even if connection is refused.
  -O,  --output-document=FILE    write documents to FILE.
  -nc, --no-clobber              skip downloads that would download to existing files.
  -c,  --continue                resume getting a partially-downloaded file.
       --progress=TYPE           select progress gauge type.
  -N,  --timestamping            don't re-retrieve files unless newer than local.
  -S,  --server-response         print server response.
       --spider                  don't download anything.
  -T,  --timeout=SECONDS         set all timeout values to SECONDS.
       --dns-timeout=SECS        set the DNS lookup timeout to SECS.
       --connect-timeout=SECS    set the connect timeout to SECS.
       --read-timeout=SECS       set the read timeout to SECS.
  -w,  --wait=SECONDS            wait SECONDS between retrievals.
       --waitretry=SECONDS       wait 1..SECONDS between retries of a retrieval.
       --random-wait             wait from 0...2*WAIT secs between retrievals.
  -Y,  --proxy                   explicitly turn on proxy.
       --no-proxy                explicitly turn off proxy.
  -Q,  --quota=NUMBER            set retrieval quota to NUMBER.
       --bind-address=ADDRESS    bind to ADDRESS (hostname or IP) on local host.
       --limit-rate=RATE         limit download rate to RATE.
       --no-dns-cache            disable caching DNS lookups.
       --restrict-file-names=OS  restrict chars in file names to ones OS allows.
  -4,  --inet4-only              connect only to IPv4 addresses.
  -6,  --inet6-only              connect only to IPv6 addresses.
       --prefer-family=FAMILY    connect first to addresses of specified family,
                                 one of IPv6, IPv4, or none.
       --user=USER               set both ftp and http user to USER.
       --password=PASS           set both ftp and http password to PASS.

Directories:
  -nd, --no-directories           don't create directories.
  -x,  --force-directories        force creation of directories.
  -nH, --no-host-directories      don't create host directories.
       --protocol-directories     use protocol name in directories.
  -P,  --directory-prefix=PREFIX  save files to PREFIX/...
       --cut-dirs=NUMBER          ignore NUMBER remote directory components.

HTTP options:
       --http-user=USER        set http user to USER.
       --http-password=PASS    set http password to PASS.
       --no-cache              disallow server-cached data.
  -E,  --html-extension        save HTML documents with `.html' extension.
       --ignore-length         ignore `Content-Length' header field.
       --header=STRING         insert STRING among the headers.
       --proxy-user=USER       set USER as proxy username.
       --proxy-password=PASS   set PASS as proxy password.
       --referer=URL           include `Referer: URL' header in HTTP request.
       --save-headers          save the HTTP headers to file.
  -U,  --user-agent=AGENT      identify as AGENT instead of Wget/VERSION.
       --no-http-keep-alive    disable HTTP keep-alive (persistent connections).
       --no-cookies            don't use cookies.
       --load-cookies=FILE     load cookies from FILE before session.
       --save-cookies=FILE     save cookies to FILE after session.
       --keep-session-cookies  load and save session (non-permanent) cookies.
       --post-data=STRING      use the POST method; send STRING as the data.
       --post-file=FILE        use the POST method; send contents of FILE.

FTP options:
       --ftp-user=USER         set ftp user to USER.
       --ftp-password=PASS     set ftp password to PASS.
       --no-remove-listing     don't remove `.listing' files.
       --no-glob               turn off FTP file name globbing.
       --no-passive-ftp        disable the "passive" transfer mode.
       --retr-symlinks         when recursing, get linked-to files (not dir).
       --preserve-permissions  preserve remote file permissions.

Recursive download:
  -r,  --recursive          specify recursive download.
  -l,  --level=NUMBER       maximum recursion depth (inf or 0 for infinite).
       --delete-after       delete files locally after downloading them.
  -k,  --convert-links      make links in downloaded HTML point to local files.
  -K,  --backup-converted   before converting file X, back up as X.orig.
  -m,  --mirror             shortcut option equivalent to -r -N -l inf -nr.
  -p,  --page-requisites    get all images, etc. needed to display HTML page.
       --strict-comments    turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
  -A,  --accept=LIST                comma-separated list of accepted extensions.
  -R,  --reject=LIST                comma-separated list of rejected extensions.
  -D,  --domains=LIST               comma-separated list of accepted domains.
       --exclude-domains=LIST       comma-separated list of rejected domains.
       --follow-ftp                 follow FTP links from HTML documents.
       --follow-tags=LIST           comma-separated list of followed HTML tags.
       --ignore-tags=LIST           comma-separated list of ignored HTML tags.
  -H,  --span-hosts                 go to foreign hosts when recursive.
  -L,  --relative                   follow relative links only.
  -I,  --include-directories=LIST   list of allowed directories.
  -X,  --exclude-directories=LIST   list of excluded directories.
  -np, --no-parent                  don't ascend to the parent directory.

Mail bug reports and suggestions to .

GNU Wget %s

%s: illegal option -- `-n%c'
%s: missing URL
Removing %s.
unlink: %s
No URLs found in %s.
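The options above combine freely on a single command line. As an illustrative sketch only (the URL, rate limit, and depth below are placeholder values, not taken from this object file), a small recursive download for offline viewing could look like:

    # Fetch a site two levels deep, pull in page requisites (images, CSS),
    # rewrite links for local browsing, and throttle the transfer rate.
    wget --recursive --level=2 --page-requisites --convert-links \
         --limit-rate=200k --wait=1 http://www.example.com/

    # The same request written with the short options documented above:
    wget -r -l 2 -p -k --limit-rate=200k -w 1 http://www.example.com/

For a complete mirror, -m is the documented shortcut for -r -N -l inf -nr.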
[Remainder of the object file: the strings SIGHUP, SIGUSR1, and WTF?!; the long_options name table (accept, append-output, background, ..., waitretry); the help section headings used above; relocation entries and the symbol table for main.c (print_usage, print_version, print_help, redirect_output_signal, main, and supporting libc/Wget symbols); and the compiler note "GCC: (GNU) 4.2.1 20070719 [FreeBSD]".]