The Politics of Porting
At each of the companies where I have worked in the past eight years, I have introduced Linux in some capacity. It often seemed the obvious choice, but what was obvious to me and other colleagues was not obvious to everyone. Some people needed persuading before they would accept what to them was an unconventional solution. In most cases, benchmarks and cost factors were enough to tip the balance. Novelty and openness to new ideas were sufficient for others. But, for a stubborn minority, nothing could persuade them. Although they participated in the debate, it appeared that for them discussion was simply a tactic to avoid implementation.
Faced with such opposition, arguments are pointless and demonstrations are a waste of time. In those situations, other tactics are called for. Here I describe how I persuaded my employer to port the company's flagship application to Linux literally before the opposition in management knew what was happening.
In 1998, I was the UNIX and Oracle administrator for a company called Constellar Ltd. They specialised in data migration and Enterprise Application Integration (EAI) solutions for Global 2000 companies. What does that mean? At its simplest, it might mean transforming the data in a legacy mainframe database for insertion into a newly prepared database. Such tasks are by their nature repetitive and time-consuming. They are ideal candidates for automation, and that is where the Constellar Hub comes in.
Add a few extra offices in different parts of the world, more databases for customer data, warehouse inventories and order tracking and imagine how these assorted data stores and applications could be made to work together seamlessly. Throw in a live environment and a company that relies on accurate available data, and you begin to understand how complex the task could become.
The main application was a client-server system called the Constellar Hub, designed to extract data from disparate sources and to process, weed and integrate it on the fly before writing or streaming the transformed data to its destination.
The server relied on an Oracle database to store data in transit, as well as the metadata in which the business rules were held, data types were defined and the relationships between the multifarious data sources and destinations were stored.
The combined transient data and metadata were held in upward of 200 database tables with many triggers and constraints to enforce the integrity of the data as it flowed through the bowels of the application. The engine of the application consisted primarily of C and SQL code, but an API provided the opportunity to extend functionality if required.
On a large project, two or more Hub servers might be required, sometimes located at remote sites around the world, feeding or being fed data over frame relay or, less likely, the Internet. A typical project would process many gigabytes of data and define hundreds, thousands or, more rarely, tens of thousands of business rules.
In many cases, the whole project of which the Hub was only a part—albeit a central part—would cost in the millions and involve dozens of technical and business staff.
Prospective customers would be given a relatively simple demonstration of the tool running on test data, typically on Sun hardware. The server had been ported to various other shades of UNIX, including Dynix, Data General UNIX, AIX and Digital UNIX.
It was also around that time that Windows NT was being promoted as an enterprise-ready OS destined to take over in many areas previously thought to be the exclusive domain of UNIX. Indeed, our vice president of engineering was convinced that the future was Microsoft and told us so. In the future, he said, most if not all our sales would be generated from the Windows NT version of the Hub. As a first move, the company already was working on a Windows NT port and had announced the beta version in February 1998. Such a view was understandable, because the cost of a Windows NT box versus the various UNIX platforms tended to highlight only the expense of UNIX solutions. Linux was destined to change all that.
By that time, I already was using Linux on all my home PCs as well as my workplace workstation. I also followed Linux developments fairly closely, especially any that might affect the adoption of Linux in the enterprise workplace. Like many other Linux advocates, I saw the possibility of a computing future that might not be restricted to the products of a single company imposed on a captive customer base and wondered if the Constellar Hub could be ported to Linux.
There were two problems, however. First, the Hub made extensive use of Oracle databases and Oracle-specific features, and it would be a nightmare porting those dependencies to a Linux-friendly equivalent. Second, and this was a more difficult hurdle to clear, a few people in the company were not predisposed to see the opportunities presented by a Linux port.
I had vaguely suggested to one or two members of middle management that perhaps there ought to be a Linux port, but the response I received was along the lines that Linux was an amazing product for a bunch of amateurs to have produced, but it wasn't quite ready for the big leagues, and the company couldn't afford the time and expense of porting to it when there was serious business to deal with. I have heard that argument many times since then and have learned it is largely an excuse for not thinking.
Two things changed the nature of the ball game. First, in September 1998, Larry Ellison announced, with excellent timing, that Oracle would release a Linux version of its database server and made it free for download for development use. This had a dual effect. It demonstrated that some very large companies were taking Linux seriously, and at the same time, it made the possibility of a Linux port of the Constellar Hub feasible.
I decided that if I couldn't persuade them to port the Hub I would do it myself, so I copied the Hub source code and set about providing a proof of concept. I figured this either would get me the sack or provide a fait accompli with which those in opposition could not argue.