Linux in Government: Federated Identity Management Business Drivers
In last week's article, we discussed federated identity management (FIM) to familiarize you with some of the concepts behind how it works. We also stressed the need for Linux practitioners to start preparing for the emergence of new products and services requiring FIM. In this week's discussion, we explain why identity management has become a requirement in many organizations.
Homeland Security Presidential Directive (HSPD) 12, dated August 27, 2004, established a policy for a common identification standard for federal employees and contractors. In the directive, the White House established these talking points:
Wide variations exist in the identification technology people use to gain access to secure facilities, and those variations need to be eliminated.
The Secretary of Commerce is responsible for setting a standard for appropriate identification within 6 months.
The heads of executive departments and agencies will have a program in place 4 to 8 months following the standard.
Within 6 and 7 months, respectively, following the Standard, the Assistant to the President for Homeland Security and the Director of OMB will recommend additional technology.
The Assistant to the President for Homeland Security will report within 7 months after the Standard on the progress implementing HSPD 12.
In response to HSPD 12, the National Institute of Standards and Technology (NIST) Computer Security Division initiated a new project for improving the identification and authentication of federal employees and contractors for access to federal facilities and information systems. Federal Information Processing Standard (FIPS) 201 started the clock for agencies to implement common smart card-based ID cards, among other identity management procedures.
FIPS 201 lays out the technical and operational requirements for the system and card. HSPD 12 requires agencies to have their access systems in place, "to the maximum extent practicable", by October 25, 2005.
Some people feel that meeting that deadline is likely to be a challenge. Although NIST is not responsible for implementing the standard, Jim Dray of NIST stated, "I don't think it's going to be possible for most agencies to continue doing business as usual and comply." People at the Office of Management and Budget (OMB) remain optimistic.
The main commercial Linux vendors may wind up providing infrastructure and provisioning to the various agencies that must meet the standard of FIPS 201 and related documents. You could say that the President of the United States created a sense of urgency in the federated identity management sector by declaring that the wide variations in identification technology used to gain access to secure facilities need to be eliminated.
That's the essence of Red Hat's entry into this market. For more information on Red Hat's product, take a look at its product page.
The new FIPS 201 standard replaces the former Government Smart Card Interoperability Specification (GSCIS). The new standard requires the DOD, for example, to re-deploy applications on 2.2 million computers and update 3.5 million Common Access Cards. And that's only one implementation.
With agencies scrambling to comply with the President's standard, many vendors find themselves racing to help those agencies meet their deadlines. You can count on IBM and its partners Red Hat and SUSE to benefit from those efforts.
In addition to FIPS 201, other federal regulations have created a need for identity management. Again, with IBM having a significant lead in the market, Linux will see its share of business. Let's take a look at the primary drivers in the compliance area.
Health Insurance Portability and Accountability Act (HIPAA)
HIPAA regulations provide for the protection of healthcare information. Control of access to information systems has become big business in the health care industry. Fines of up to $100,000 and prison terms of up to five years for noncompliance make HIPAA compliance a big concern.
HIPAA regulations affect business processes, information systems operations and information systems sharing. HIPAA-compliant privacy and security features require structured identity management solutions that we have seen in products such as IBM's Tivoli Access Manager, which runs on Linux and interoperates with a variety of other software platforms.
HIPAA regulations impose requirements to enforce formal security policies and procedures for granting different levels of access to patient information.
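Granting different levels of access to patient information is, at its core, a role-based access control problem. The following is a minimal sketch of the idea, with hypothetical role names and record fields invented for illustration; a real HIPAA-compliant deployment would rely on a full identity management product such as those discussed above, not ad hoc code.

```python
# Hypothetical role-to-field mapping: each role may read only certain
# fields of a patient record. Roles and fields are illustrative only.
ROLE_PERMISSIONS = {
    "physician":     {"name", "diagnosis", "medications", "billing"},
    "billing_clerk": {"name", "billing"},
    "researcher":    {"diagnosis", "medications"},  # de-identified access
}

def authorized_view(role, record):
    """Return only the fields of a patient record the role may see."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    return {field: value for field, value in record.items() if field in allowed}

record = {
    "name": "J. Doe",
    "diagnosis": "hypertension",
    "medications": ["lisinopril"],
    "billing": "account 1042",
}

# A billing clerk sees identity and billing data but no clinical data;
# a researcher sees clinical data but no identifying information.
print(authorized_view("billing_clerk", record))
print(authorized_view("researcher", record))
```

Commercial products layer auditing, delegation and policy workflow on top of this basic model, but the policy-to-enforcement mapping follows the same shape.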
Gramm-Leach-Bliley Act (GLB)
Gramm-Leach-Bliley regulations became effective on February 1, 2001. The US Treasury Department issued guidelines interpreting the privacy and security requirements contained in the GLB Act of 1999, also known as the Financial Modernization Act of 1999.
The GLB exists primarily to repeal restrictions on banks affiliated with securities firms. It requires financial institutions--including preparers of income tax returns, consumer credit reporting agencies, real estate transaction settlement services and debt collection agencies--to adopt privacy measures relating to customer data.
The legislation eliminated legal barriers to affiliations among banks and securities firms, insurance companies and other financial services companies. Such affiliations require legal and security safeguards. The Federal Deposit Insurance Corporation (FDIC), Federal Reserve System (FRS), Federal Trade Commission (FTC), Securities and Exchange Commission (SEC), National Credit Union Administration (NCUA), Office of the Comptroller of the Currency (OCC) and the Office of Thrift Supervision all regulate some area of Gramm-Leach-Bliley.
Sarbanes-Oxley Act (Sarbox)
The Sarbanes-Oxley Act of 2002 has created numerous logistical, operational and economic challenges for public companies. Sarbox requires CEOs and CFOs of public companies to swear under oath that the financial statements they publish are accurate and complete. This is supposed to protect investors by improving the reliability of corporate financial statements. It imposes stiff penalties for auditors, corporate officers, company directors and others who violate the Act. Every publicly traded company registered under the Exchange Act or that has a pending registration statement under the Securities Act of 1933 falls under the regulations.
Anyone who fails to comply with Sarbox can expect stiff penalties, including jail terms for executives. The new processes and procedures needed to ensure compliance often drive organizations to implement identity management and to automate many of those processes.
Identity management technology helps automate processes that enable Sarbox compliance. For example, it addresses security processes associated with establishing "adequate internal controls" around financial reporting. By mapping these processes as well as internal security policies to automated identity management, companies can utilize frameworks for improving security and ensuring compliance.
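One common internal control that identity management systems automate is separation of duties: no single person should hold two roles that together let them both initiate and approve a financial action. The sketch below encodes such a policy as data so it can be checked automatically; the role names and conflict pairs are illustrative assumptions, not taken from any particular product.

```python
# Hypothetical separation-of-duties policy: pairs of roles that a
# single user must never hold at the same time.
CONFLICTING_ROLES = [
    ("create_vendor", "approve_payment"),
    ("prepare_statements", "audit_statements"),
]

def policy_violations(user_roles):
    """Return the conflicting role pairs a single user holds."""
    roles = set(user_roles)
    return [pair for pair in CONFLICTING_ROLES if roles.issuperset(pair)]

# A user who can both create vendors and approve payments violates policy;
# a user holding at most one role from each pair is compliant.
print(policy_violations({"create_vendor", "approve_payment"}))
print(policy_violations({"create_vendor", "audit_statements"}))
```

In a real deployment, a check like this would run against the identity store during provisioning and periodic access reviews, flagging violations before auditors do.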