FedTech Soundoff

Posted by David Barton on September 6, 2011 in Resources

Cloud exit plans
As more government agencies and private-sector enterprises begin their transition to the cloud, recent research indicates that very few have a viable exit strategy in place or have even considered what will happen in the event of potential pitfalls. The following industry experts explain why it’s important for organizations to have an exit plan in place before they migrate to the cloud.

Dmitry Kagansky, chief technologist, Quest Software Public Sector
Amid all the activity surrounding cloud computing, some federal agencies are still struggling with the execution of a sustainable long-term cloud computing strategy.

According to a survey commissioned by Quest Software Public Sector, 90.3 percent of federal IT professionals said they do not have a cloud computing exit strategy or do not know whether they have one.

There are a number of highly strategic determinations that agencies need to make to establish a successful cloud computing strategy. A well thought-out cloud computing strategy provides public-sector organizations with the ability to avoid vendor lock-in and retrieve their assets without disruption in operations or loss of data.

Without a viable overall strategy, which includes a backup plan, agencies run the risk of vendor lock-in, which might result in violations in the areas of security, identity management and data ownership. Other potential consequences include a loss of IT functionality and higher maintenance costs, which could eventually negate the benefits of moving to the cloud. Thus, adopting vendor-agnostic tools and best practices is the foundation of a reliable cloud architecture.

First and foremost, organizations need to focus on shoring up their identity and access management architecture. This involves configuring proper data access in the cloud and ensuring full visibility into who has access to and control of information and services. This should also take into account data access provisioning and management. To secure data in the cloud and avoid third-party control of their assets, agencies can choose to keep cloud service management in-house if they are adopting a public cloud model. Many government and commercial enterprises are viewing this as an essential foundation for safely entering a cloud architecture.
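
To get the kind of visibility Kagansky describes, an agency can periodically reconcile the access grants its cloud provider reports against the grants its own provisioning process has actually approved. The sketch below is a minimal, hypothetical illustration in Python; the grant records, identities and export format are assumptions rather than any particular provider’s API.

```python
# Minimal sketch of a vendor-agnostic access review, assuming the agency can
# export its cloud access grants (user, resource, permission) to a common format.
# All identities and records below are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class AccessGrant:
    user: str        # identity as known to the cloud provider
    resource: str    # data set or service the grant applies to
    permission: str  # e.g. "read", "write", "admin"


# Grants as reported by the provider's console or API export (assumed data).
cloud_grants = [
    AccessGrant("analyst.jones", "case-files", "read"),
    AccessGrant("contractor.smith", "case-files", "admin"),
]

# Grants the agency's own provisioning workflow has actually approved.
approved_grants = {
    AccessGrant("analyst.jones", "case-files", "read"),
}


def unapproved(grants, approved):
    """Return grants visible in the cloud that provisioning never approved."""
    return [g for g in grants if g not in approved]


if __name__ == "__main__":
    for grant in unapproved(cloud_grants, approved_grants):
        print(f"REVIEW: {grant.user} holds '{grant.permission}' on {grant.resource}")
```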

Rick White, chief technical officer and VP of solutions, Wyle Information Systems
The time to start planning a cloud exit strategy is not after you have implemented cloud services, but rather as part of your overall cloud strategy and go/no-go decision process. As with any technology insertion or migration, you should have a back-out plan. As part of your technical planning and cost/benefit analysis, ask yourself and your proposed service provider the following questions: What type of cloud services are we invoking, and how will we exit?

Exiting from infrastructure as a service raises different issues than exiting from platform or software as a service because of the differing degrees of control over the data. Hosting platforms and application services on a service provider’s infrastructure (e.g. data center consolidation) is well understood. Developing and deploying web/SOA-based applications on hosted infrastructure with platform or software as a service presents new challenges for moving applications, databases and data. Discuss your exit strategy with your service provider and have them answer these questions for you.

What would compel me to change cloud service providers?
Identify the key factors that would drive a decision to change service providers or bring the service back in house, and specify the metrics needed to support the decision process. Review these factors for second- and third-order dependencies to be sure you fully examine the implications of your decision. Develop a cost model that accounts for tangible and intangible factors (e.g. lost productivity). Have you already taken financial credit for savings in your out-year budgets?

What risks will we encounter in moving applications and data back to our control or to another service provider?
Proprietary data formats and structures employed by the service provider may create a barrier to switching (vendor lock-in). At the end of the day, data is everything to your organization, and the ability to access and move data where you want, when you want, cannot be hindered or blocked by proprietary solutions. The cost and challenge of data migration have to be clearly understood so you have a basis for estimating what it will cost to exit. Take email archiving, for example. If you’re required to maintain an email archive under federal regulations, what does that mean for you if you decide to switch providers?
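
For the email example, one way to keep the exit door open is to pull messages out of the hosted service over standard IMAP into an open mbox archive that any provider, or an in-house system, can still read. The sketch below is a minimal illustration that relies only on Python’s standard imaplib and mailbox modules; the host, account and folder names are placeholders, not a reference to any specific provider.

```python
# Minimal sketch of exporting hosted mail into an open mbox archive over
# standard IMAP, so the archive stays readable after a provider switch.
# Host, account and folder names are hypothetical placeholders.
import imaplib
import mailbox

IMAP_HOST = "imap.example-provider.gov"    # hypothetical provider endpoint
ACCOUNT = "records.officer@agency.example"
PASSWORD = "use-a-real-secret-store"       # placeholder; never hard-code secrets


def export_folder_to_mbox(folder: str, mbox_path: str) -> int:
    """Copy every message in an IMAP folder into a local mbox file."""
    archive = mailbox.mbox(mbox_path)
    with imaplib.IMAP4_SSL(IMAP_HOST) as conn:
        conn.login(ACCOUNT, PASSWORD)
        conn.select(folder, readonly=True)        # never modify the source
        _, data = conn.search(None, "ALL")
        message_ids = data[0].split()
        for msg_id in message_ids:
            _, msg_data = conn.fetch(msg_id, "(RFC822)")
            archive.add(msg_data[0][1])           # raw RFC 822 bytes
    archive.flush()
    archive.close()
    return len(message_ids)


if __name__ == "__main__":
    count = export_folder_to_mbox("INBOX", "agency-archive.mbox")
    print(f"Archived {count} messages in an open, provider-neutral format.")
```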

What are the unintended consequences of application dependencies we’ve created?
Some of the costs of switching can be easily hidden in the interdependencies you develop between applications (e.g. email notification, alerts and workflow). Understanding every internal decision to link your local and SaaS solutions means recognizing the hidden costs you’re introducing when it comes time to migrate to a new provider.

At the end of the day, a cloud exit strategy will likely be treated like any decision to transition, but it can be a lot less painful if you plan for it before you take the first leap.

Raymond Roberts, CEO, Citizant
Top military leaders know that you should never move into a new operating theater without a plan for getting out. This wisdom is no less true in the brave new world of cloud computing.

Cloud computing offers major benefits to federal agencies that want to modernize their applications, consolidate data centers, share information and ensure scalability and high availability of IT systems. But the cloud also brings risks. As time, leadership and events evolve, so may policies. For example, the domestic and international rules for data ownership and access may change as European commissioners challenge whether the U.S. Patriot Act supersedes the European Union’s Data Protection Directive.

There are early indications that the strategy of centralizing computing resources in big, power-hungry data centers — the current home of the cloud — may not last forever. According to a recent radio report, Microsoft is already testing lightweight, portable data centers and theorizing that servers can be built into mobile gadgets and laptops in a highly enmeshed, high-bandwidth network.

CTOs and CIOs inherently have the role of chief risk officers. They must ensure they can respond to these risks, up to and including a partial or complete move out of the cloud altogether, should the need arise.

Our technologists have worked closely with federal agencies for many years on enterprise architecture projects, which increasingly involve migrating applications and data sources to the cloud while meeting current and expected regulatory and statutory requirements. They have found that best practices for developing solid cloud exit strategies need to vary widely. This is an especially critical challenge in the law enforcement and national security mission areas, in which many of our customers operate.

The specifics of a cloud exit plan must always be tailored to the agency, its mission, applicable laws, business processes, data governance and many other issues. But the wisdom of creating a withdrawal plan — likely for each application and data set independently — before you migrate to the cloud, is universal.

Mobility
The rapid integration of mobile devices into the everyday operations of government agencies has presented the contracting industry with countless opportunities. The growing demand for new technologies, applications and innovations has not only led to greater cooperation between the private sector and its federal customers, but the evolving mobile market has raised the bar for competitors looking to impact the industry. As with every new technology, however, mobile devices have brought about new vulnerabilities to federal infrastructures. It’s now up to contractors to equip the government with the necessary defenses against cyber threats.

John Bordwine, chief technology officer, public sector, Symantec
Among the various challenges that government agencies are facing is the increased use of mobile devices in their organizations. It’s understandable.

Employees and workers simply want to do their job in the fastest, most convenient way, and the ready access provided by mobile devices allows them to do that.

However, speed and convenience often increase security risks, particularly with regards to mobile threats. At the annual Symantec Government Symposium, we polled 195 of the public sector’s top IT decision makers about the proliferation of mobile devices, and the picture they painted was stark: Nearly half of federal employees and contractors admit to accessing sensitive or confidential information from their mobile devices, and yet one in three reports no authentication or encryption whatsoever on the mobile platform.

Perhaps even more alarming, one in three respondents admits using personal mobile devices for work purposes, and half admit using work-issued mobile devices for personal purposes. We call this ‘cross-contamination,’ and it’s a significant problem — particularly considering fewer than one in 10 report having a mobile device management system that includes personal devices.

Undoubtedly, addressing the challenges posed by an explosion of government mobility (sometimes referred to as the ‘consumerization of IT’) is a long game and one that cannot be accomplished through fast, reactive measures. But the situation is far from hopeless.

In the near term, fundamental improvements can be made to mobile policy guidance, employee training and centralized device management measures like authentication, encryption and backup. In the longer term, we advocate a comprehensive migration from system-centric to information-centric security operations, which means a shift away from structured, centralized, perimeter-defended data toward unstructured, distributed, self-secured data, no matter what type of device is accessing the information.
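
In practice, those near-term device management measures reduce to checks an agency can run against its device inventory. The following minimal sketch is a hypothetical illustration of such a check; the device records and the 30-day backup threshold are assumptions, not a description of any particular management product.

```python
# Minimal sketch of the near-term checks described above: flag devices that
# lack authentication, encryption or a recent backup, and personal devices
# used for work that sit outside device management.
# The device records and policy thresholds are hypothetical assumptions.
from dataclasses import dataclass


@dataclass
class MobileDevice:
    owner: str
    personally_owned: bool     # employee's own device used for work
    managed: bool              # enrolled in the device-management system
    passcode_enabled: bool     # basic authentication on the device
    storage_encrypted: bool
    days_since_backup: int


POLICY_MAX_BACKUP_AGE_DAYS = 30   # assumed policy threshold


def compliance_issues(device: MobileDevice) -> list[str]:
    """Return the list of policy gaps for a single device."""
    issues = []
    if not device.passcode_enabled:
        issues.append("no authentication configured")
    if not device.storage_encrypted:
        issues.append("storage not encrypted")
    if device.days_since_backup > POLICY_MAX_BACKUP_AGE_DAYS:
        issues.append("backup out of date")
    if device.personally_owned and not device.managed:
        issues.append("personal device accessing work data outside MDM")
    return issues


if __name__ == "__main__":
    fleet = [
        MobileDevice("j.doe", personally_owned=True, managed=False,
                     passcode_enabled=True, storage_encrypted=False,
                     days_since_backup=90),
        MobileDevice("a.smith", personally_owned=False, managed=True,
                     passcode_enabled=True, storage_encrypted=True,
                     days_since_backup=7),
    ]
    for device in fleet:
        for issue in compliance_issues(device):
            print(f"{device.owner}: {issue}")
```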

People are the new perimeter, and we must plan to protect sensitive information whenever and wherever it can be accessed — at rest or in motion — through the coordinated protection of identity, device, information and infrastructure. But the first step, as always, is simply understanding the problem.

Dr. Phyllis A. Schneck, chief technology officer, McAfee Security
As more federal agencies equip their employees with smartphones and other mobile devices, mobile security considerations are of chief importance for the government’s IT decision makers. Before implementing mobile platforms such as smartphones and tablets, agencies must balance the needs of the mission against the overall risk they would potentially incur.

Mobile platforms introduce new vectors of exposure. For example, location data are now correlated with other information resident on a device, potentially allowing an adversary to track not just a device, but a person. A sound assessment of risk guiding the investment in cybersecurity for mobile devices allows government IT decision makers to capitalize on the efficiencies and conveniences of mobile platforms while creating a resilient infrastructure that is cognizant of the exposures as well.

In addition to location, mobile devices and trends in app development and use introduce a new risk. Recently, the McAfee Threats Report: First Quarter 2011 found that Android was the second most popular platform for mobile malware that quarter and may well be the system with the most malware by the end of the year. Android’s vulnerabilities can in part be attributed to Google’s open system, which allows individuals to create and post apps without subjecting them to a vetting process. Conversely, Apple, whose operating system has historically been the least likely target for malware, has a highly regimented review process in which all apps must conform to stringent guidelines before earning a spot in its store. A key take-home point is that applications are a source of risk.

The great news is that mobile platforms make life and work easier and better in many ways for all. Government agencies can benefit from mobile infrastructure by assessing their exposure and tolerable risk, then fitting that assessment to a mobile solution that provides the correct balance. Simple measures such as central management of devices, control of application downloads, and sound, centrally managed governance of settings such as location can create this balance.
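
Control of application downloads, in particular, can be as simple as reconciling what a managed device reports against a centrally vetted catalog, in the spirit of the app-vetting contrast drawn above. The short sketch below is hypothetical; the package names and the allowlist are assumptions used for illustration.

```python
# Minimal sketch of application-download control: compare the apps reported
# on a managed device against a centrally vetted allowlist.
# Package names and the allowlist itself are hypothetical.
VETTED_APPS = {
    "gov.agency.secure-mail",
    "gov.agency.travel-vouchers",
}


def unvetted_apps(installed: set[str]) -> set[str]:
    """Return installed packages that never went through the vetting process."""
    return installed - VETTED_APPS


if __name__ == "__main__":
    reported = {"gov.agency.secure-mail", "com.example.flashlight-game"}
    for pkg in sorted(unvetted_apps(reported)):
        print(f"Block or review: {pkg}")
```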

It is important for federal agency IT decision makers to recognize that while some mobile platforms may be a better fit for an organization, all operating systems can have malware written for them, and all can be attacked and rooted. That said, once a device has been chosen, it is imperative that all federal government users be extensively trained on proper use of the phone. Agency IT departments need to work with human resources departments to take employees through hands-on, behavioral training.

We have not even begun to see the tremendous benefits of mobile computing. Balancing risk in decisions here will enable the mission of government agencies and create new efficiencies and conveniences for those who carry that mission. Teaching users about their device, their applications and their settings is the key to allowing them to identify issues and protect themselves against these problems.

In the end, it is user behavior, more than any particular security program, that makes users far safer and ultimately protects an agency’s data.

Dr. Stephen Cambone, executive vice president, strategic development, QinetiQ North America
There’s no doubt that the penetration and use of personal devices, such as smartphones and tablets, in the public sector space is exploding. We’ve seen some studies that show approximately 70 percent of employees are using their own phones for work-related activities. Today, the federal government stands squarely in the path of this unstoppable train.

Government leaders are starting to recognize and appreciate the value that these smartphones and tablets bring to the table. They see that these devices can increase productivity, and there is even talk of leveraging smartphones on the battlefield. Inevitably, however, as personal devices proliferate across the government space, the government must play a bigger role in controlling smartphone protocols due to security issues.

As with any other network access point, the government must safeguard proprietary data sets and address major security implications related to smartphones and tablets. The technology to secure these devices and the data they hold lags behind the ability to exploit these mobile systems that provide convenient access to rich application functionality. The Defense Department is aware of these security risks and is working with vendors through its DoD Information Assurance Certification and Accreditation Process (DIACAP).

QinetiQ North America has taken a unique 360-degree view of mobile security that is aligned with DIACAP. This is supported by a variety of application-level encryption and data storage policies, as well as adequate authorization, authentication and accounting systems. Such an approach addresses the risks involved in using commercial networks and mobile devices through additional, active cyber-intelligence and threat-sensing systems. This 360-degree approach makes possible the rapid deployment of smartphones and tablets through active defense of people, data and assets.

It is now up to industry to provide the highest levels of security and address the risks posed by government’s use of smartphones. If we fail to take the lead, government will never realize the full power and advantages that smartphones can provide.

Health IT
The Office of the National Coordinator for Health IT — known as the ONC — has been developing “meaningful use” requirements for healthcare providers to begin adopting electronic health records, or EHRs, to receive incentive payments. In other words, ONC will open the purse strings for healthcare providers who can show they meaningfully use health IT methods. The following private-sector health IT experts sound off on meaningful-use deadlines and the role of the private sector in helping ONC and healthcare providers begin implementing now.

Jim Traficant, president, Harris Healthcare Solutions
Meaningful use is more than a series of government-mandated requirements for physicians and healthcare organizations. Instead, MU is an authentic, long-term pathway to healthcare transformation and top clinical, operational and financial performance — the kind of transformation the nation is looking for.

The multistage journey toward MU will result in creative solutions to address longstanding healthcare problems — from fragmentation, duplication and medical errors, to uneven quality, runaway costs and restricted access. But MU also sets the stage for accountable, value-based healthcare by focusing on functions like care coordination and collaboration, health data exchange and patient engagement. Through open-source technologies like CONNECT, Harris has been able to free up data from isolated health systems that were not designed to work together and pull the information from where it was created to where it’s needed for a seamless clinician experience.

To develop and participate in an Accountable Care Organization — or ACO — providers must coordinate care, exchange clinical and administrative data, and engage patients with accurate health information. These are the key ingredients for transforming healthcare.

Providers may talk about technology strategy, but what they really need are technology solutions that fulfill MU and embrace emerging trends toward high-performance, value-based accountable care. The ideal technology solutions will achieve interoperability, close gaps in care, ease care transitions, enhance information access and integration, improve workflow and facilitate more intelligent business, clinical and financial decision-making.

Mary Whitley, senior vice president in health information and health technology services, ICF International
As part of the HITECH Act of 2009, our nation’s healthcare providers have been given the challenging goal of aiming at a fluid target with a meandering trajectory called meaningful use. The aims of MU are desirable in terms of anticipated improvements in quality of care, efficiency and cost reduction, but also quite ambitious.

The health IT challenges may include implementation schedules and costs that far exceed expectations and the incentives meant to offset them. In addition, change management and human-factors issues such as data consistency, user workflow and usability, including provider resistance to adoption, are not solved through technology alone. Recognition of realities such as these has led to re-evaluation as we now enter stage 2 of MU, which was planned for implementation in 2013. For example, the stage 2 enhancements seek increased patient engagement and include important population-health and public-health requirements and implications.

ICF International is currently addressing these challenges for its clients, including its advisory work for the Agency for Healthcare Research and Quality’s assessment of health IT measures as well as implementation of comparative effectiveness research features in enhanced cancer registries for the Centers for Disease Control and Prevention. Projects such as these put ICF’s expertise and perspectives at the heart of the MU initiative.

Keith Salzman, chief medical information officer, Transformations Solutions Group, CACI International
The stage 2 goals for meaningful use are currently in review and reflect the challenges that were identified with the release and revision of the stage 1 goals. The capabilities and standards driving MU are sound and, where they are in place and used, have delivered improved care. The challenge is to enable a critical mass of users to use the capabilities (the implication is that they have an EHR platform as well as the available tools) to impact the workflow in their organizations and realize the benefit for patients and providers.

ONC must be both prescriptive, to provide a direction, and an enabler, meeting the goals with policy and infrastructure without being obstructive or setting unrealistic goals. Its response to the stage 1 experience reflects acknowledgement of the latter (unrealistic expectations), though the enablers and a clear statement of standards are critical to establishing tangible solutions that are pervasive and functioning to deliver on the promise of improved outcomes.

CACI, as well as other integrators, contractors and vendors, will help the most by focusing on propagating the basic infrastructure, without which all the functionality and capabilities will be moot.
