New Projects - Fresh from the Labs
Starting off this month we have Bombono: a simplistic DVD authoring program that doesn't have the steep learning curve of many others in its field. According to the Web site: “Bombono DVD is a DVD authoring program for Linux. It is easy to use and has nice and clean GUI (Gtk).”
Also from the Web site, the main features of Bombono DVD are:
Excellent MPEG viewer: timeline and monitor.
Real WYSIWYG menu editor with live thumbnails.
Comfortable drag-and-drop support.
You can author to folder, make ISO images or burn directly to DVD.
Reauthoring: you can import video from DVD discs.
Binaries are available for Ubuntu, SUSE, ALT Linux and Arch Linux, along with the usual source tarball. For those compiling from source, there are some pretty stringent library requirements. The documentation lists the following:
gtk+ >= 2.8 (www.gtk.org)
gtkmm >= 2.4 (www.gtkmm.org)
SCons >= 0.96.1 (www.scons.org)
GraphicsMagick >= 1.1.7 (www.graphicsmagick.org)
mjpegtools >= 1.8.0 (mjpeg.sourceforge.net)
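If you're not sure whether your installed libraries meet those minimums, a quick check along these lines may help. (The pkg-config module names and version flags below are my assumptions and may differ on your distribution.)

```shell
# Print installed versions to compare against Bombono's requirements.
# Module and command names are assumptions; adjust for your distro.
pkg-config --modversion gtk+-2.0       # want >= 2.8
pkg-config --modversion gtkmm-2.4      # want >= 2.4
scons --version                        # want >= 0.96.1
GraphicsMagick-config --version        # want >= 1.1.7
mpeg2enc --version                     # part of mjpegtools; want >= 1.8.0
```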
On my own Kubuntu system (obviously, some packages were already in place), I needed to install: scons, libglibmm-2.4-dev, libxml++2.6-dev, libmjpegtools-dev, graphicsmagick, libgraphicsmagick1, libgraphicsmagick++1, libgraphicsmagick++1-dev and libgtkmm-2.4-dev.
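On a Debian-based system like mine, that boils down to a single apt-get line. (These package names are correct for my Kubuntu release and may have changed since.)

```shell
# Pull in Bombono's build dependencies in one go
# (Debian/Ubuntu package names; other distros will differ).
sudo apt-get install scons libglibmm-2.4-dev libxml++2.6-dev \
    libmjpegtools-dev graphicsmagick libgraphicsmagick1 \
    libgraphicsmagick++1 libgraphicsmagick++1-dev libgtkmm-2.4-dev
```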
Head to the Web site, grab the latest tarball, extract it and open a terminal in the new folder. Enter:

$ scons
If your distribution uses sudo, enter:
$ sudo scons install
If not, enter:
$ su
# scons install
Once the compilation has finished, you will find Bombono in your menu, or you can start it with the command:

$ bombono-dvd
A quick note before we jump in: you can use only MPEG-2 .vob files. It's a bummer, I know, so if you have files such as some DivXes you want to include, you're going to have to convert them first. Hopefully, future releases will support DivX, Xvid, MPEG-4 and so on, but for now, you'll have to make do with just .vob support.
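As a stopgap, a converter such as ffmpeg can turn most video files into DVD-compliant MPEG-2. A rough sketch follows; the filenames are placeholders, and you'll want -target ntsc-dvd instead if you're in an NTSC region:

```shell
# Convert a DivX/Xvid AVI into a DVD-compliant MPEG-2 file
# that Bombono can import (filenames are placeholders).
ffmpeg -i holiday.avi -target pal-dvd holiday.vob
```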
Once you have some .vob files available, start the program, click on the Source tab, and look at the file browser on your left. Locate the files you're going to use, and either drag them across into the Media List pane or click the blue + sign. While we're in the source section, clicking the Edit button when a video is highlighted lets you split the file into chapters using the timeline below (more on that later).
For now, let's move on to the menu tab. Click the + sign in the Menu List pane to create a new menu. Then, you can add a menu object to link to a video along with some accompanying text. To add a menu object, choose the shape of the object you want to add and click the + sign next to it. Once the object has been made, you can move it around the screen or resize it. Now, let's make a link to the video you want to run from the menu object.
Right-click on the object and choose Link→(name of video). Note that if you've edited your videos to include chapters, these chapters can be linked to also, but I don't have the space to cover that here.
Now, let's add some text. You'll be in the standard mode for manipulating the menu objects as signified by the highlighted mouse pointer, but press the T button (as in T for text), and you'll be ready to go. Click next to an object and you'll see a blinking cursor, ready for you to start typing. If you don't like the font size or color, you can change them in the above menu.
With a menu out of the way, let's get back to editing—more specifically, making chapters. Return to the Source tab, and make your way back to the Media List pane. Select the file to which you want to add chapters, and click the green Edit button. Your file now will load up in the big timeline below.
When you're editing for chapters, if it's not evident from the on-screen still image where you are, you'll need to play the video in a separate player and note the time of each point, as the video does not seem to play in the editing window itself (although I could be wrong and missing something obvious). Still, I found this wasn't really a Herculean task, so it shouldn't be much of a worry.
In the timeline section, the top slider is for browsing around inside the video, where the on-screen image will update depending on the position in time. If you look to the left, the strong blue digital readout will give you the exact time the slider is sitting on. To mark out a chapter, click the blue button underneath the time readout (I'll call it a Chapter Marker), and a Chapter Point will be made under the slider, marked with the same icon as the Chapter Marker. If you want to fine-tune this position at all, you also can slide around the Chapter Point, and if you've made one accidentally, you can right-click and choose Delete Chapter Point.
When you're happy with your soon-to-be DVD, head to the Output tab. Here, you can write a DVD folder, make a disc image on your hard drive, or just burn the project straight to DVD.
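If you take the folder or image route and want to burn later, the dvd+rw-tools can finish the job from the command line. A sketch, assuming your burner is /dev/dvd and that Bombono left you either an ISO image or an authored DVD folder (both names here are placeholders):

```shell
# Burn a finished ISO image straight to disc...
growisofs -dvd-compat -Z /dev/dvd=project.iso

# ...or build and burn a DVD-Video filesystem from an authored folder.
growisofs -dvd-compat -Z /dev/dvd -dvd-video my_dvd/
```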
Of course, this project has some hurdles to overcome before it's truly ready for the mainstream. The most pressing issue is that you can use only .vob files for now. This is reasonable enough, but faced with the often tricky task of converting files, most lazy people like myself are going to throw the whole job in the too-hard basket and go back to playing Half-Life. It'll be truly ready only when you can add almost any video file. That will probably require some structural changes to the design and coding, perhaps a video-conversion stage prior to burning, but it will be necessary all the same.
Nevertheless, the authors have taken a good approach that I respect—keeping it tight and simple to begin with and working properly with the elements they do have, instead of creating an unstable mess with lots of features. This project is simplistic and highly satisfying, and it probably will become a distro mainstay once it reaches fruition. I'm looking forward to the finished product.
John Knight is the New Projects columnist for Linux Journal.