Packet Sniffing Basics
Imagine this: you're sitting in your local coffee shop sucking down your morning caffeine fix before heading into the office. You catch up on your work e-mail, you check Facebook, and you upload that financial report to your company's FTP server. Overall, it's been a constructive morning. By the time you get to work, there's a whirlwind of chaos throughout the office. That incredibly sensitive financial report you uploaded was somehow leaked to the public, and your boss is outraged by the crass and unprofessional e-mail you just sent him. Was there some hacker lurking in the shadows who broke into your company's network and decided to lay the blame on you? More than likely not. This mischievous ne'er-do-well was probably sitting in the same coffee shop you stopped at and seized the opportunity.
Without some form of countermeasures, your data isn't safe on public networks. This example is a worst-case scenario on the far end of the spectrum, but it isn't so far-fetched. There are people out there who are capable of stealing your data. The best defense is to know what you can lose, how it can get lost and how to defend against it.
What Is Packet Sniffing?
Packet sniffing, or packet analysis, is the process of capturing data passed over the local network and looking for any information that may be useful. Most of the time, we system administrators use packet sniffing to troubleshoot network problems (like finding out why traffic is so slow in one part of the network) or to detect intrusions or compromised workstations (like a workstation that maintains a constant connection to a remote machine on port 6667 when nobody in the office uses IRC). That is what this type of analysis originally was designed for. But, that didn't stop people from finding more creative ways to use these tools. The focus quickly moved away from the original intent—so much so that packet sniffers now are considered security tools rather than network tools.
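As a rough sketch of the intrusion-detection use case above, the following standalone Python snippet parses the headers of a raw IPv4/TCP packet and flags traffic aimed at port 6667 (IRC). A real sniffer would read these bytes from a capture interface or a pcap file; here the packet is hand-built, and all names and addresses are invented for illustration.

```python
import struct

IRC_PORT = 6667

def tcp_dest_port(ip_packet: bytes) -> int:
    """Return the TCP destination port of a raw IPv4 packet."""
    ihl = (ip_packet[0] & 0x0F) * 4          # IPv4 header length in bytes
    # TCP header starts right after the IP header:
    # source port (2 bytes), then destination port (2 bytes)
    src, dst = struct.unpack("!HH", ip_packet[ihl:ihl + 4])
    return dst

def looks_like_irc(ip_packet: bytes) -> bool:
    return tcp_dest_port(ip_packet) == IRC_PORT

# Hand-built 20-byte IPv4 header (protocol 6 = TCP) followed by the
# first 4 bytes of a TCP header aimed at port 6667.
ip_header = struct.pack("!BBHHHBBH4s4s",
                        0x45, 0, 40, 1, 0, 64, 6, 0,
                        bytes([10, 0, 0, 1]), bytes([10, 0, 0, 2]))
tcp_start = struct.pack("!HH", 50000, IRC_PORT)
print(looks_like_irc(ip_header + tcp_start))   # True
```

A monitoring script built along these lines would simply run this check against every captured packet and log the source address of any match.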
Figure 1. A Capture of a Packet of Someone Trying to Log In to a Web Site
Finding out what someone on your network is doing on the Internet is not some arcane and mystifying talent anymore. Tools like Wireshark, Ettercap or NetworkMiner give anybody the ability to sniff network traffic with a little practice or training. These tools have become increasingly easy to use, putting them within reach of an ever-broader user base.
Figure 2. Tools like NetworkMiner can reconstruct images that have been broadcast on the network.
How Does It Work?
Now you know that these tools are out there, but how exactly do they work? First, packet sniffing is a passive technique. No one is actually attacking your computer or delving through all those files that you don't want anyone to access. It's a lot like eavesdropping. My computer is just listening in on the conversation that your computer is having with the gateway.
Typically, when people think of network traffic, they think it goes directly from their computer to the router or switch, up to the gateway and then out to the Internet, where it routes similarly until it reaches its destination. This is mostly true except for one fundamental detail. Your computer isn't sending the data directly anywhere. It sends the data in packets that carry the destination in the header. On a shared medium—a hub, old coaxial Ethernet or an open Wi-Fi network—every node receives every packet, determines whether it is the intended recipient and then either accepts the packet or ignores it. (A modern switch normally forwards each frame only to the intended recipient's port, which is why sniffing on a switched network usually takes a mirror port or a trick like the ARP spoofing that tools such as Ettercap automate.)
For example, let's say you're loading the Web page http://example.com on your computer "PC". Your computer sends the request by basically shouting "Hey! Somebody get me http://example.com!", which most nodes simply will ignore. The request eventually will be received by example.com, which will pass its index page back to your router, which then shouts "Hey! I have http://example.com for PC!", which again will be ignored by everyone except you. If others were on your network segment with a packet sniffer, they'd receive all that traffic and be able to look at it.
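The accept-or-ignore behavior described above can be sketched in a few lines. A normal interface keeps only frames addressed to its own MAC address (or to broadcast), while an interface in promiscuous mode—the mode a sniffer uses—keeps everything it sees. The MAC addresses below are made-up examples.

```python
BROADCAST = "ff:ff:ff:ff:ff:ff"

def accepts(frame_dest: str, my_mac: str, promiscuous: bool = False) -> bool:
    """Decide whether an interface keeps a frame it has seen on the wire."""
    if promiscuous:
        return True                       # a sniffer keeps every frame
    return frame_dest in (my_mac, BROADCAST)

pc  = "aa:bb:cc:00:00:01"   # your machine
spy = "aa:bb:cc:00:00:99"   # someone else on the same segment

frame_for_pc = "aa:bb:cc:00:00:01"
print(accepts(frame_for_pc, pc))                     # True: it's for us
print(accepts(frame_for_pc, spy))                    # False: normally ignored
print(accepts(frame_for_pc, spy, promiscuous=True))  # True: the sniffer sees it
```

In practice, this filtering happens in the network card itself; putting the card into promiscuous mode (which capture tools do for you) is what switches it from the second case to the third.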
Picture it like having a conversation in a bar. You can have a conversation with someone about anything, but other people are around who potentially can eavesdrop on that conversation, and although you thought the conversation was private, eavesdroppers can make use of that information in any way they see fit.
What Kind of Information Can Be Gathered?
Most of the Internet runs in plain text, which means that most of the information you look at is viewable by someone with a packet sniffer. This information ranges from the benign to the sensitive. Note that this data is exposed only over an unencrypted connection, so if the site you are using offers some form of encryption, such as SSL/TLS, your data is much less vulnerable.
The most devastating data, and the stuff most people are concerned with, is user credentials. Your user name and password for any site that doesn't use encryption are passed in the clear for anyone to gather. This can be especially crippling if you use the same password for all your accounts on-line. It doesn't matter how secure your bank's Web site is if you use the same password for that account and for your Twitter account. Further, if you type your credit-card information into an unsecured Web page, it is just as vulnerable, although there aren't many (if any) sites that continue this practice for that exact reason.
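To make the plain-text risk concrete, here is a small sketch of what a sniffer does with a captured, unencrypted HTTP login request: the form body is just a URL-encoded string, so the credentials fall out with a standard-library parser. The payload, field names and site are all invented for illustration.

```python
from urllib.parse import parse_qs

# An invented example of a captured, unencrypted login request.
captured_payload = (
    "POST /login HTTP/1.1\r\n"
    "Host: example.com\r\n"
    "Content-Type: application/x-www-form-urlencoded\r\n"
    "\r\n"
    "username=alice&password=hunter2"
)

# Everything after the blank line is the form body.
body = captured_payload.split("\r\n\r\n", 1)[1]
fields = parse_qs(body)
print(fields["username"][0], fields["password"][0])   # alice hunter2
```

With an encrypted connection, the same capture would yield only ciphertext, which is exactly why the previous paragraph's advice about SSL/TLS matters.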