Client-Side Web Scripting
The opposite approach, i.e., starting a generic mirroring or image-fetching script from your browser, is possible in Konqueror (or even KMail) during normal browsing. If you right-click on a link and select the “Open with...” option, Konqueror lets you enter the path of the script to be used and adds it to the list of choices for next time. This means you can prepare a mirror or fetch_images script following the instructions given here and start it in the background on any URL you wish with a couple of clicks.
The URL list contained in the @ALL_URLS array also can be used to start mirroring or (parallel) FTP sessions. This can be done entirely in Perl, using the many FTP and mirroring modules available, or simply by collecting the URLs to be mirrored or fetched by FTP, and leaving the actual work to wget or curl, as explained in A. J. Chung's article, “Downloading without a Browser” (see Resources).
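As a minimal sketch of the pure-Perl route, the standard Net::FTP module can fetch a file from one of the collected URLs. The host, login and file name below are made-up placeholders, not values from the article:

```perl
#!/usr/bin/perl
# Hypothetical sketch: fetch one file over FTP entirely in Perl.
# Host, credentials and path are invented placeholders.
use strict;
use warnings;
use Net::FTP;

my $ftp = Net::FTP->new('ftp.example.com', Timeout => 30)
    or die "cannot connect: $@";
$ftp->login('anonymous', 'me@example.com') or die $ftp->message;
$ftp->binary;                              # binary mode for archives
$ftp->get('pub/somefile.tar.gz')           or die $ftp->message;
$ftp->quit;
```

In a real script you would loop over the entries of @ALL_URLS instead of hard-coding one host and file.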
If your favorite web portal chooses a different cool site every day, and you want your PC to mirror it for you, just fetch the URL as you would for images, and then say in your script:
exec "wget -m -L -t 5 $COMPLETE_URL";
All the parallel FTP and mirroring commands explained in Chung's article can be started this way from a Perl script, passing as arguments the URLs found by this one.
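One hedged way to run several such downloads in parallel from Perl is to fork() one child per URL and exec an external downloader in each, then reap all the children. The @ALL_URLS contents below are placeholders:

```perl
#!/usr/bin/perl
# Hypothetical sketch: one wget mirror job per URL, run in parallel.
# @ALL_URLS would normally be filled by the URL-collecting code.
use strict;
use warnings;

my @ALL_URLS = ('http://example.com/', 'http://example.org/');

foreach my $url (@ALL_URLS) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                        # child process
        exec 'wget', '-m', '-L', '-t', '5', $url;
        die "exec wget failed: $!";         # only reached if exec fails
    }
}
wait() for @ALL_URLS;                       # parent: reap every child
```

For many URLs you would also want to cap the number of simultaneous children, for example with a simple counter or a module such as Parallel::ForkManager.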
Many of us have more than one favorite site and would like to have them all in the same window. A general solution for this is to extract the complete HTML body of each page in this way:
$HTML_FILE =~ s/^.*<body[^>]*>//i;    # strips everything before <body>
$HTML_FILE =~ s/<\/body[^>]*>.*$//i;  # strips everything after </body>
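Here is a minimal, self-contained illustration of that extraction; the sample HTML string is invented for the demo, and the /s modifier is added so the patterns also work when the page spans multiple lines:

```perl
#!/usr/bin/perl
# Demo of stripping everything outside the <body>...</body> element.
# The sample page below is a made-up placeholder.
use strict;
use warnings;

my $HTML_FILE = "<html><head><title>x</title></head>"
              . "<BODY bgcolor=\"white\"><p>Hello</p></body></html>";

$HTML_FILE =~ s/^.*<body[^>]*>//is;   # strip everything up to <body ...>
$HTML_FILE =~ s/<\/body[^>]*>.*$//is; # strip </body> and everything after

print "$HTML_FILE\n";                 # prints: <p>Hello</p>
```

Regular expressions are a rough tool for HTML, but for the simple "grab the page body" job described here they are usually good enough.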
and then print out an HTML table with each original page in each box:
print <<END_TABLE;
....All HTML <HEAD> and <BODY> stuff here
<TABLE>
<TR><TD>$HTML_FILE_1</TD></TR>
<TR><TD>$HTML_FILE_2</TD></TR>
.........
</TABLE></BODY></HTML>
END_TABLE

Save the script output in $HOME/.myportal.html, set that file as your starting page in your browser and enjoy! The complete script will probably require quite a bit of tweaking to clean up different CSSes, fonts and so on, but you know how to do it by now, right?
We have barely scratched the surface of client-side web scripting. Much more sophisticated tasks are possible, such as dealing with cookies and password-protected sites, automatic form submission, web searches with all the criteria you can think of, scanning a whole web site to display its ten most-pointed-to URLs in a histogram, and web-mail checking.
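To give a flavor of one such task, here is a hedged sketch of logging in to a password-protected site with LWP::UserAgent and a cookie jar. The URL and form field names are invented; a real site will use its own:

```perl
#!/usr/bin/perl
# Hypothetical sketch: cookie-based login to a protected site.
# The login URL and form fields are made-up placeholders.
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

my $ua = LWP::UserAgent->new;
$ua->cookie_jar(HTTP::Cookies->new(file     => "$ENV{HOME}/.my_cookies",
                                   autosave => 1));

# Submit the login form; the session cookie is stored automatically.
my $resp = $ua->post('http://www.example.com/login',
                     { user => 'myname', password => 'secret' });
die 'login failed: ', $resp->status_line unless $resp->is_success;

# Later requests reuse the saved cookie and reach protected pages.
print $ua->get('http://www.example.com/private')->content;
```

The same user-agent object can then fill in and submit other forms, which is the basis of the automatic form submission mentioned above.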
You only need some patience, Perl practice and a good knowledge of the relevant modules to succeed. Good browsing!
Articles about Digital Rights and more at http://stop.zona-m.net. CV, talks and bio at http://mfioretti.com.