wget question

I would like to get a list of the files in a directory on an FTP site.
Can I use wget to do it?

If yes, then how?

Any other method that I can easily implement in a bash script is also welcome.


You could also do this using Net::FTP


You could also do this using perl's Net::FTP module:

use strict;
use warnings;
use Net::FTP;

#Change your.ftp-host.com to your ftp domain
my $ftp = Net::FTP->new("your.ftp-host.com", Debug => 0)
        or die "Can not connect: $@";
#Change UserName and PassWord to your login information
$ftp->login("UserName", "PassWord")
        or die "Can not log in: ", $ftp->message;
#If you want to list your root directory you can comment this out
#Or change public_html to where you want to get a file list
$ftp->cwd("public_html")
        or die "Can not change directory: ", $ftp->message;

my @list = $ftp->ls($ftp->pwd);
my $listing = "list.txt";

open(LISTING, ">$listing")
        or die "Can not open $listing: $!";

foreach my $line (@list) {
        print LISTING $line . "\n";
}

close(LISTING);
$ftp->quit;


Use "ftp"


Use the ftp command. For example, to get the items in /pub/lj/listings from the LJ ftp server:

ftp ftp://ftp.linuxjournal.com <<EOF
dir /pub/lj/listings filelist.tmp
quit
EOF

sed -e 's/.* //' <filelist.tmp >filelist.txt

This will put the names into filelist.txt. Note that the sed command will produce incorrect results if any of the file names contain blanks. In that case you'll have to use a different method of extracting the file names from the raw results file (filelist.tmp).
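One such method, sketched here under the assumption that the server returns Unix-style "ls -l" lines (eight whitespace-separated fields before the name, which some servers do not), is to strip those leading fields with awk so that names containing blanks survive intact. The sample listing below is hypothetical; substitute the filelist.tmp produced by the ftp command above.

```shell
#!/bin/sh
# Hypothetical sample of what "dir" might write to filelist.tmp;
# actual output varies from one FTP server to another.
cat > filelist.tmp <<'SAMPLE'
-rw-r--r--   1 ftp  ftp   1024 Jan 01 12:00 plain.txt
-rw-r--r--   1 ftp  ftp   2048 Feb 02 13:00 name with blanks.txt
SAMPLE

# Delete the first eight whitespace-separated fields (permissions,
# link count, owner, group, size, month, day, time); whatever is
# left -- blanks included -- is the file name.
awk '{ for (i = 1; i <= 8; i++) sub(/^[^ \t]+[ \t]+/, ""); print }' \
    filelist.tmp > filelist.txt

cat filelist.txt
```

Unlike the sed one-liner, this keeps everything after the timestamp, so "name with blanks.txt" comes through whole rather than being cut at its last space.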

Mitch Frazier is an Associate Editor for Linux Journal.
