Working with CSV Files from the Command Line

How to extract and manipulate CSV data from the command line.

Download in .ogv format

Comments

Thanks a lot! Such a concise

qubodup:

Thanks a lot! Such a concise, "let's get started without talking long" video helps me a lot to learn the CLI efficiently!

I recommend the csv module

Anonymous:

I recommend the csv module in Python.

http://docs.python.org/library/csv.html

It is really easy to use and handles fields with embedded newlines, quotes, and commas with flying colors!

Give it a shot some time, you won't be disappointed :).
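For anyone curious, here is a minimal sketch of the csv module handling a quoted field with an embedded comma; the python3 invocation and the sample data are my own illustration, not from the original comment:

```shell
# Feed a small CSV snippet to Python's csv module from the shell.
# The quoted "1,350.00" survives intact -- something a plain cut -d, cannot do.
amounts=$(python3 - <<'EOF'
import csv, io

data = '''1,Cat food,55.69,05/04/2009
2,Power bill,149.75,03/04/2009
3,Rent,"1,350.00",01/04/2009'''

for row in csv.reader(io.StringIO(data)):
    print(row[2])          # third field: the amount
EOF
)
echo "$amounts"
# prints 55.69, 149.75 and 1,350.00, each on its own line
```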

-TH

Big caveat

Administra.tion.ca:

Most CSV files I get these days have quoted string fields, and those quoted fields often contain commas of their own between the separators.

You'd better be very sure of your CSV contents if you're going to use cut like this.

It'll fail on this CSV file:

1,Cat food,55.69,05/04/2009
2,Power bill,149.75,03/04/2009
3,Rent,"1,350.00",01/04/2009
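To make the failure concrete, here is a quick sketch (expenses.csv is a made-up file name holding the rows above):

```shell
# Recreate the sample data from this comment.
cat > expenses.csv <<'EOF'
1,Cat food,55.69,05/04/2009
2,Power bill,149.75,03/04/2009
3,Rent,"1,350.00",01/04/2009
EOF

# cut splits on every comma, including the one inside the quotes,
# so the third row's amount comes out mangled.
cut -d, -f3 expenses.csv
# prints: 55.69, 149.75, then "1  (the quoted field split at its comma)
```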

Use Grep

Mitch Frazier:

My original thinking was to use grep:

   grep -i '[^,]*@[^,]*' stuff.csv ...

to extract the email addresses, but I decided on cut to simplify the "look" of it a bit for the video.

Of course, your sample doesn't include any email addresses, so that wouldn't work as-is; you'd have to adapt it to extract the right field.

A perhaps more robust method is to convert the CSV file to use tabs rather than commas; with tabs, cut should work most of the time. You might still run into quoted tabs, but unless your CSV came from a database dump, that seems unlikely.
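As a sketch, assuming the data can be obtained tab-delimited (expenses.tsv is a hypothetical file):

```shell
# A tab-separated version of the troublesome row; the embedded comma
# in the amount is now harmless because tab is the only separator.
printf '3\tRent\t1,350.00\t01/04/2009\n' > expenses.tsv

cut -f3 expenses.tsv    # tab is cut's default delimiter
# prints: 1,350.00
```

Note that simply running tr ',' '\t' on a quoted CSV would convert the embedded commas too, so the conversion itself needs a CSV-aware tool or a tab-delimited export at the source.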

Mitch Frazier is an Associate Editor for Linux Journal.

How will grep help us parse?

Rob Russell:

That grep will identify the matching lines in a CSV file, but it won't parse out the values. And yes, I deal with tons of huge CSV files that go in and out of databases, so it's important that I get my scripts right.

The only thing I've found that comes close to what I need is http://perlmeme.org/tutorials/parsing_csv.html -- so I'm in the process of rewriting that into something bash can invoke from inside shell scripts.

Forgot an option

Mitch Frazier:

Forgot the -o option:

  grep -o -i '[^,]*@[^,]*' stuff.csv ...

that will show you only the part that matched.
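A quick illustration of what -o changes (the stuff.csv contents here are a made-up sample):

```shell
# One made-up CSV row containing an email address.
printf 'Smith,John,jsmith@example.com,2009\n' > stuff.csv

# Without -o, grep prints the whole matching line; with -o it prints
# only the part of the line that matched the pattern.
grep -o -i '[^,]*@[^,]*' stuff.csv
# prints: jsmith@example.com
```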

But that's still not "parsing"

Rob Russell:

Sure, that'll extract an email address, but it doesn't fit any accepted definition of "parsing": it assumes you have prior knowledge of what's in the field, and it can't properly handle lines containing more than one @.

You're handling strings of text whose formatting you can predict, but you are certainly not using techniques that are transferable to the general case of "I need to parse a CSV file."

That's Correct

Mitch Frazier:

You're 100% right: neither cut nor grep is a generalized solution for parsing CSV files, but then again I never said they were. As far as I know I never used the word "parse"; that was your word.

What I meant to say was that grep is both a floor wax and a dessert topping.

No need for uniq

Gumnos:

GNU sort has a --unique (-u) argument that sorts and checks for uniqueness in the same pass. So you can just use "sort -uf" or "sort --ignore-case --unique", skipping uniq altogether.
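A sketch comparing the two pipelines on a hypothetical names.txt:

```shell
# Four names that differ only in case.
printf 'Alice\nBOB\nalice\nBob\n' > names.txt

# Two-process version: case-folding sort piped to case-insensitive uniq.
sort -f names.txt | uniq -i

# One-process version: GNU sort folds case and drops duplicates itself.
sort -u -f names.txt

# Both print the same two lines (one Alice variant, one Bob variant).
```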

-Tim

Thanks

Mitch Frazier:

Yes, it does, and that's why I always say one should re-RTFM once in a while to refresh one's memory and check for new features.

I might also add that one shouldn't jump to the conclusion that the -u option makes uniq obsolete: uniq has some other useful options that come in handy once in a while. So re-RTFM applies to uniq as well.
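For instance, uniq's counting and duplicate-reporting options have no sort -u equivalent (fruits.txt is a made-up example):

```shell
# Six lines, with two values repeated.
printf 'apple\napple\nbanana\ncherry\ncherry\ncherry\n' > fruits.txt

sort fruits.txt | uniq -c    # prefix each distinct line with its count
sort fruits.txt | uniq -d    # print only the lines that occur more than once
```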

Beat me to it

mattcen:

I'm glad somebody mentioned this; I found out about that a couple of months ago. Much better than having to pipe to yet another process :D

--
Regards,
Matthew Cengia
