Working with CSV Files from the Command Line


How to extract and manipulate CSV data from the command line.

Download in .ogv format
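
The comments below refer to commands from the video; as a rough sketch of the kind of pipeline it demonstrates (the file name stuff.csv and the field number are illustrative assumptions, not necessarily what the video uses):

   # pull the third comma-separated field out of the file,
   # then reduce it to a sorted list of unique values
   cut -d, -f3 stuff.csv | sort | uniq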

Comments


Thanks a lot!

Posted by qubodup

Thanks a lot! Such a concise, "let's get started without talking long" video helps me a lot to learn the CLI efficiently!

I recommend the csv module

Posted by Anonymous

I recommend the csv module in python.

http://docs.python.org/library/csv.html

It is really easy to use and handles fields with embedded newlines, quotes, and commas with flying colors!
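
For instance, a minimal sketch of calling it from the shell to pull out the third field (the file name stuff.csv is borrowed from the grep example below; the field number is arbitrary):

   # csv.reader understands quoting, so commas embedded in quoted fields survive
   python3 -c 'import csv,sys; [print(row[2]) for row in csv.reader(sys.stdin)]' < stuff.csv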

Give it a shot some time, you won't be disappointed :).

-TH

Big caveat

Posted by Administra.tion.ca

Most CSV files I get these days have quoted fields that contain commas inside the quotes, between the separators.

You'd better be very sure of your CSV contents if you're going to use cut like this.

It'll fail on this CSV file:

   1,Cat food,55.69,05/04/2009
   2,Power bill,149.75,03/04/2009
   3,Rent,"1,350.00",01/04/2009
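
For example, assuming that sample is saved as expenses.csv (a made-up name), pulling the amount column with cut:

   cut -d, -f3 expenses.csv

prints:

   55.69
   149.75
   "1

because cut splits on every comma, including the one inside the quoted 1,350.00.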

Use Grep

Posted by Mitch Frazier

My original thinking was to use grep:

   grep -i '[^,]*@[^,]*' stuff.csv ...

to extract the email, but I decided on cut to simplify the "look" of it a bit for the video.

Of course, your sample doesn't include any emails, so that wouldn't work as-is; you'd have to adapt it to extract the right field.

A perhaps more robust method is to convert the CSV file to use tabs rather than commas. With tabs, cut should work most of the time. You might still have quoted tabs, but unless your CSV came from a database dump, that seems unlikely.
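
One way to do that conversion safely is with a quote-aware tool such as the Python csv module mentioned above; a sketch (stuff.csv and stuff.tsv are illustrative names):

   # re-emit the file tab-separated; commas inside quoted fields stay put
   python3 -c 'import csv,sys; csv.writer(sys.stdout, delimiter="\t").writerows(csv.reader(sys.stdin))' < stuff.csv > stuff.tsv
   cut -f3 stuff.tsv   # tab is cut's default delimiter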

Mitch Frazier is an Associate Editor for Linux Journal.

How will grep help us parse?

Posted by Rob Russell

That grep will identify matching lines, but it won't parse out the values. And yes, I deal with tons of huge CSV files that go in and out of databases, so it's important that I get my scripts right.

The only thing I've found that comes close to what I need is http://perlmeme.org/tutorials/parsing_csv.html -- so I'm in the process of rewriting that into something bash can invoke from inside shell scripts.
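
A sketch of what that might look like, using the Text::CSV module from that tutorial (assuming it is installed) wrapped in a perl one-liner that a shell script can call:

   # print each record's fields tab-separated, one record per line
   perl -MText::CSV -e '
       my $csv = Text::CSV->new({ binary => 1 });
       while (my $row = $csv->getline(\*STDIN)) {
           print join("\t", @$row), "\n";
       }
   ' < stuff.csv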

Forgot an option

Posted by Mitch Frazier

Forgot the -o option:

  grep -o -i '[^,]*@[^,]*' stuff.csv ...

that will show you only the part that matched.
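
For example, with a made-up record on stdin:

   echo '1,Joe,joe@example.com,05/04/2009' | grep -o -i '[^,]*@[^,]*'

prints just:

   joe@example.com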

Mitch Frazier is an Associate Editor for Linux Journal.

But that's still not "parsing"

Posted by Rob Russell

Sure, that'll extract an email address, but it doesn't fit any accepted definition of "parsing," since it assumes you have prior knowledge of what's in the field and can't properly handle more than one @ on a line.

You're handling strings of text whose formatting you can predict, but you are certainly not using techniques that transfer to the general case of "I need to parse a CSV file."

That's Correct

Posted by Mitch Frazier

You're 100% right: neither cut nor grep is a generalized solution for parsing CSV files, but then again, I never said they were. As far as I know, I never used the word "parse"; that was your word.

What I meant to say was that grep is both a floor wax and a dessert topping.

Mitch Frazier is an Associate Editor for Linux Journal.

No need for uniq

Posted by Gumnos

GNU sort has a --unique (-u) option that sorts and checks for uniqueness in the same pass. So you can just use "sort -uf" or "sort --ignore-case --unique", skipping uniq altogether.
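
In other words, a case-insensitive unique pipeline like this (the field number and file name are just examples):

   cut -d, -f3 stuff.csv | sort -f | uniq -i

collapses to:

   cut -d, -f3 stuff.csv | sort -uf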

-Tim

Thanks

Posted by Mitch Frazier

Yes, it does, and that's why I always say one should re-RTFM once in a while to refresh one's memory and check for new features.

I might also add that one shouldn't jump to the conclusion that the -u option makes uniq obsolete: uniq has some other useful options that come in handy once in a while. So re-RTFM applies to uniq also.
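
For example, counting occurrences is something sort -u can't do (file and field borrowed from the earlier examples):

   # prefix each unique value with the number of times it occurs
   cut -d, -f3 stuff.csv | sort | uniq -c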

Mitch Frazier is an Associate Editor for Linux Journal.

Beat me to it

Posted by mattcen

I'm glad somebody mentioned this; I found out about that a couple of months ago. Much better than having to pipe to yet another process :D

--
Regards,
Matthew Cengia
