Client-Side Web Scripting
The opposite approach, starting a generic mirroring or image-fetching script from your browser, is possible in Konqueror (or even KMail) during normal browsing. If you right-click on a link and select the "Open with..." option, Konqueror lets you enter the path of the script to be used and adds it to the choices the next time. This means you can prepare a mirror or fetch_images script following the instructions given here and start it in the background on any URL you wish with a couple of clicks.
The URL list contained in the @ALL_URLS array can also be used to start mirroring or (parallel) FTP sessions. This can be done entirely in Perl, using the many FTP and mirroring modules available, or simply by collecting the URLs to be mirrored or fetched by FTP and leaving the actual work to wget or curl, as explained in A. J. Chung's article, "Downloading without a Browser" (see Resources).
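As a minimal sketch of the wget route, the snippet below assumes @ALL_URLS has already been filled by the link-extraction code shown earlier, dumps the list to a file (the file name is just an example) and lets wget read it back with its -i option:

use strict;
use warnings;

our @ALL_URLS;    # filled earlier by the link-extraction code

# Dump the collected URLs to a file, one per line
my $list = "$ENV{HOME}/.urls_to_mirror";
open my $fh, '>', $list or die "Cannot open $list: $!";
print $fh "$_\n" for @ALL_URLS;
close $fh;

# -m mirrors each site; -i makes wget read the URL list from a file
system('wget', '-m', '-i', $list) == 0
    or warn "wget exited with status $?\n";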
If your favorite web portal chooses a different cool site every day, and you want your PC to mirror it for you, just fetch the URL as you would do for images, and then say in your script:
exec "wget -m -L -t 5 $COMPLETE_URL";
All the commands for parallel FTP and mirroring explained in Chung's article can be started in this way from a Perl script, with the URLs found by this one as their arguments.
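As a sketch of that idea (again assuming @ALL_URLS holds the collected links), the loop below forks one child process per URL, so all the downloads run in parallel:

use strict;
use warnings;

our @ALL_URLS;    # filled earlier by the link-extraction code

foreach my $url (@ALL_URLS) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {               # child: replace ourselves with wget
        exec 'wget', '-m', '-t', '5', $url;
        die "exec failed: $!";     # reached only if exec itself fails
    }
}
wait() for @ALL_URLS;              # parent: reap every child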
Many of us have more than one favorite site and would like to have them all in the same window. A general solution for this is to extract the complete HTML body of each page in this way:
$HTML_FILE =~ s/^.*<body[^>]*>//i;    # strips everything before the body
$HTML_FILE =~ s/<\/body[^>]*>.*$//i;  # strips everything after it
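To see the two substitutions in context, here is a sketch of a small helper that fetches a page and returns only its body. The name page_body is invented for illustration; the added /s modifier lets the dot match newlines, since real pages rarely keep everything on one line:

use strict;
use warnings;
use LWP::Simple qw(get);

sub page_body {
    my ($url) = @_;
    my $html = get($url) or return '';
    $html =~ s/^.*<body[^>]*>//is;     # strip everything before the body
    $html =~ s/<\/body[^>]*>.*$//is;   # strip everything after it
    return $html;
}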
and then print out an HTML table with each original page in each box:
print<<END_TABLE; ....All HTML <HEAD> and <BODY> stuff here <TABLE> <TR><TD>$HTML_FILE_1</TD></TR> <TR><TD>$HTML_FILE_2</TD></TR> ......... </TABLE></BODY></HTML> END_TABLESave the script output in $HOME/.myportal.html, set that file as your starting page in your browser and enjoy! The complete script will probably require quite some tweaking to clean up different CSSes, fonts and so on, but you know how to do it by now, right?
We have barely scratched the surface of client-side web scripting. Much more sophisticated tasks are possible, such as dealing with cookies and password-protected sites, automatic form submission, web searches with any criteria you can think of, scanning a whole web site and displaying the ten most-pointed-to URLs in a histogram, and web-mail checking; a sketch of the first of these tasks follows.
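The snippet below logs in to a purely hypothetical password-protected site by submitting its login form and keeping the session cookie on disk; the URL and form field names are invented for illustration:

use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

my $ua = LWP::UserAgent->new(
    cookie_jar => HTTP::Cookies->new(
        file     => "$ENV{HOME}/.lwp_cookies",
        autosave => 1,
    ),
);

# Submit the login form; the cookie jar keeps the session alive
# for every later request made through $ua
my $response = $ua->post('http://www.example.com/login',
    { user => 'myname', password => 'secret' });  # hypothetical fields
print $response->is_success ? "Logged in.\n" : $response->status_line . "\n";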
You only need some patience, Perl practice and a good knowledge of the relevant modules to succeed. Good browsing!