Use curl to Monitor Your Vonage Phone Bill


If you're a Vonage user and you'd like to keep tabs on your bill as the month progresses, the script described here can help. The script uses curl to log in to your Vonage account and download the web page showing your current balance. The balance is then extracted from the page using grep and sed.

Downloading web pages with curl is fairly easy; it gets a bit tricky, though, when you need to log in to a web site before you can get to the page you want to download. The basic sequence of steps for getting to a page behind a login page using curl is:

  • Determine the login URL (the page with the login form on it).
  • Open the login page in your browser.
  • Use the browser's "View Source" option to look at the HTML and locate the login form. Make note of the URL that the form posts to and the names of the fields that get posted. Generally there are only two fields, username and password (although they may have different names), but there may also be hidden fields that you need to include; if so, you also need to work out what they are and what values are appropriate to send.
  • Retrieve the login page with curl. This step may not be strictly necessary, but if the page that the form posts to expects certain cookies to be set, you need this first request so that the cookies can be created and saved (which curl does for you).
  • Invoke curl a second time and post the login data.
  • Now invoke curl a third time passing the URL of the actual page that you want to retrieve.
  • If the site has a logout link you can optionally also use curl a fourth time to retrieve the logout page to ensure that the session is closed.
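The form-inspection step above can itself be scripted. The sketch below runs grep over a saved copy of a login page; the HTML here is an illustrative stand-in, not Vonage's actual markup:

```shell
# Stand-in for a downloaded login page; a real one would come from
# something like: curl --output login.html http://www.example.com/login
cat > login.html <<'EOF'
<form action="/public/login.htm" method="post">
  <input type="text" name="username">
  <input type="password" name="password">
  <input type="hidden" name="csrf" value="abc123">
</form>
EOF

# Pull out the form's target URL and the names of the fields it posts
action=$(grep -o 'action="[^"]*"' login.html)
fields=$(grep -o 'name="[^"]*"' login.html)
echo "$action"
echo "$fields"

rm -f login.html
```

A hidden field like the csrf one shown here would have to be posted along with the username and password, and its value typically changes on every page load, which is exactly why retrieving the login page first can matter.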

The curl command below retrieves the Vonage login page:

curl --silent --cookie-jar $cookie_jar \
    --output $web_page-1 \
    http://www.vonage.com/?login

Note the --cookie-jar option: it stores any cookies set by the website in the specified file. The file in which to store the retrieved page is specified by the --output option.

The curl command below now posts the login data required by the login form:

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --location \
    --data "username=$username&password=$password" \
    --output $web_page-2 \
    https://secure.vonage.com/vonage-web/public/login.htm

Notice that in addition to the --cookie-jar option we also specify the --cookie option. This tells curl to use the cookie jar created by the first invocation as input for this invocation. We also specify the --location option so that any redirects sent by the server are followed. The actual data to post is specified with the --data option. The values before the equals signs are the field names from the login form; the values after them are the field values to post.
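One caveat with --data: curl sends the string exactly as given, so a password containing characters such as &, =, or spaces would corrupt the posted form data. Newer versions of curl provide --data-urlencode to handle this; alternatively, a small shell helper (a sketch, not part of the original script) can percent-encode a value first:

```shell
# Percent-encode a string for use in application/x-www-form-urlencoded data
urlencode() {
    local s=$1 out= c i
    for (( i = 0; i < ${#s}; i++ )); do
        c=${s:i:1}
        case $c in
            [a-zA-Z0-9._~-]) out+=$c ;;            # unreserved characters pass through
            *) out+=$(printf '%%%02X' "'$c") ;;    # everything else becomes %XX
        esac
    done
    printf '%s\n' "$out"
}

urlencode 'p@ss&word'    # prints p%40ss%26word
```

The --data argument could then be built as "username=$(urlencode "$username")&password=$(urlencode "$password")".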

The following two curl commands now retrieve the billing page and logout from Vonage:

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --output $web_page-3 \
    https://secure.vonage.com/webaccount/billing/index.htm

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --location \
    --output $web_page-4 \
    https://secure.vonage.com/webaccount/public/logoff.htm

All that's left now is to extract the account balance from the billing page (the third page that we retrieved). After looking at the returned HTML, I was able to see where the data I wanted was located and determine a way to filter out all the extraneous information using grep and sed:

echo Phone bill: $(grep 'td_value_total_amount' $web_page-3 | sed -e 's/.*>\$//' -e 's/<.*//')
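To see what the sed pipeline is doing, it can be exercised on a single line of sample markup (the HTML below is illustrative; the real page contains a table cell tagged with td_value_total_amount). The first sed expression strips everything up to and including the dollar sign that precedes the amount; the second strips the closing tag that follows it:

```shell
sample='<td id="td_value_total_amount">$12.50</td>'
echo "$sample" | sed -e 's/.*>\$//' -e 's/<.*//'
# prints: 12.50
```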

The following shows a sample run of the script:

$ sh check-vonage.sh


Phone bill: 12.50

The entire script follows:

#!/bin/bash

cookie_jar=cookies.tmp
web_page=vonage.tmp
username=USERNAME
password=PASSWORD

trap "rm -f $cookie_jar $web_page-*" EXIT

curl --silent --cookie-jar $cookie_jar \
    --output $web_page-1 \
    http://www.vonage.com/?login

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --location \
    --data "username=$username&password=$password" \
    --output $web_page-2 \
    https://secure.vonage.com/vonage-web/public/login.htm

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --output $web_page-3 \
    https://secure.vonage.com/webaccount/billing/index.htm

curl --silent --cookie $cookie_jar --cookie-jar $cookie_jar \
    --location \
    --output $web_page-4 \
    https://secure.vonage.com/webaccount/public/logoff.htm

echo
echo
echo Phone bill: $(grep 'td_value_total_amount' $web_page-3 | sed -e 's/.*>\$//' -e 's/<.*//')
Attachment: check-vonage.sh_.txt (843 bytes)
______________________

Mitch Frazier is an Associate Editor for Linux Journal.

Comments


tip to get the --data string

Posted by Anonymous

Good article; here's a tip to get the --data string.

1. Use lynx to download the source code of the login page:
lynx --source "http://loginpage.com" > some_file
(you might have to edit lynx.cfg to accept self-signed SSL certs)

2. Open the login file with vim, change method=post to method=get, and save the file.

3. Use your web browser to open the edited login file, fill in the fields, and hit the submit button.
The string for the --data option will be in the address bar of your web browser after the ?.

Using this method saves you from missing hidden fields.
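Step 2 of this tip can also be done non-interactively. With GNU sed, the edit might look like this (the pattern is a sketch that handles both quoted and unquoted method attributes):

```shell
# Stand-in for the lynx output from step 1
cat > some_file <<'EOF'
<form action="/login" method="post">
EOF

# Flip the form method so the browser puts the posted fields in the URL
sed -i 's/method="\?post"\?/method="get"/I' some_file

result=$(grep -o 'method="[^"]*"' some_file)
echo "$result"    # method="get"
rm -f some_file
```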

Interesting idea. Please consider using mktemp

Posted by R A Lichtensteiger

Very nice example of using curl to cull information from a web site. I do have one quibble -- instead of defining your variables like this:

cookie_jar=cookies.tmp
web_page=vonage.tmp

I recommend you use mktemp(1) to create a randomized directory and put your files in there. This avoids predictable file names containing potentially sensitive information:

ME=`basename $0`
TMPDIR=`mktemp -qt -d ${ME}.XXXXXX`
if [ $? -ne 0 ]
then
    echo "${ME} Error: unable to create temp directory"
    exit 1
fi

COOKIEJAR=${TMPDIR}/cookies
PAGEBASE=${TMPDIR}/vonage-out
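Combining that suggestion with the trap-based cleanup already used in the article might look like this (a sketch; the whole directory, cookies included, is removed when the script exits):

```shell
ME=$(basename "$0")
TMPDIR=$(mktemp -d -t "${ME}.XXXXXX") || {
    echo "${ME} Error: unable to create temp directory" >&2
    exit 1
}
# Remove the temp directory and everything in it on exit
trap 'rm -rf "$TMPDIR"' EXIT

COOKIEJAR=${TMPDIR}/cookies
PAGEBASE=${TMPDIR}/vonage-out
echo "$TMPDIR"
```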
