Trends in Federal Records Management

Three Principles for Successful Federal Records Management

The following is a summary of my comments, delivered on Wednesday, January 29, 2014, at the Alfresco Content.Gov event in Washington, DC.

In my 27 years of federal service, I've watched the growth in federal records and the implementation of new executive orders and regulations aimed at improving records management across the federal space. There are immense challenges associated with litigation, review and release, tracing factual evidence for analysis, managing information for legal proceedings, and overseeing a plethora of authorized and unauthorized disclosures of classified and/or sensitive information.

Federal records management professionals are true, unsung heroes in helping our nation protect information while also protecting the civil liberties and privacy of our nation's citizens. The job has become increasingly difficult in today's era of "big data." Records and information management was hard in the 1980s, and that's when we thought big data meant hundreds of gigabytes. As we consider today's generation of data, three decades later, federal records professionals are charged with managing tens of thousands of gigabytes, petabytes, and even zettabytes of data. It's an especially daunting task.

Three principles for records management are critical to future success for the federal space:

  1. Capture on creation;
  2. Manage and secure through the workflow; and
  3. Archive responsibly.

Point 1: Capture on Creation
The federal workforce creates content every second of every day. The content is created in formal and informal ways. It's an email, a meeting invitation, an instant message, a voice communication, a VTC session, a PowerPoint deck, meeting minutes, a collaborative engagement session, a memorandum, a written paper, analytic notes, and so forth.

The federal workforce stores this content in just as many formal and informal ways. It's stored on local hard drives, mobile phones, corporate storage, shadow IT storage, public clouds, and private clouds.

In short...it's a mess for the records management professional.

What we need are solid systems and capabilities that demand capture at content creation. Simple, non-intrusive ways to drive the creator to label information will help tremendously. Non-intrusive doesn't mean voluntary; labeling at content creation needs to be forced and demanded. Not everything is a record, but many things deserve to be preserved for after-action review, lessons learned, and knowledge management training over time.
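To make "forced, non-intrusive labeling" concrete, here is a minimal sketch in Python of what capture on creation might look like: content simply cannot be persisted without the labels a records professional will need later. The class, function, and label values are all hypothetical, invented for illustration rather than drawn from any particular system.

    # Sketch: content cannot be saved without valid record labels.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    REQUIRED_CLASSIFICATIONS = {"UNCLASSIFIED", "CONFIDENTIAL", "SECRET", "TOP SECRET"}
    DISPOSITION_SCHEDULES = {"TEMPORARY-3YR", "TEMPORARY-7YR", "PERMANENT"}

    @dataclass(frozen=True)
    class RecordLabel:
        creator: str
        classification: str
        disposition: str
        created_utc: str

    def save_record(body: bytes, creator: str,
                    classification: str, disposition: str) -> RecordLabel:
        """Refuse to persist content that arrives without valid labels."""
        if classification not in REQUIRED_CLASSIFICATIONS:
            raise ValueError(f"unknown classification: {classification!r}")
        if disposition not in DISPOSITION_SCHEDULES:
            raise ValueError(f"unknown disposition schedule: {disposition!r}")
        label = RecordLabel(creator, classification, disposition,
                            datetime.now(timezone.utc).isoformat())
        # In a real system, the labeled content would go to managed storage here.
        return label

    label = save_record(b"meeting minutes ...", "jdoe",
                        "UNCLASSIFIED", "TEMPORARY-7YR")

The point of the sketch is the refusal path: the creator is asked for two labels at the moment of creation, and the system declines to store anything without them.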

Many of today's technologies make it far too easy to create content and far too difficult to manage it in perpetuity.  Content creation with longevity in mind is critical for the federal records management professional and for the federal government in general.

Implementing technologies that work together to achieve the longevity goal is paramount. No federal agency can survive on one tool; one tool rarely meets the variety of end-user needs or requirements. Discovering and implementing technologies with easy interfaces, open APIs, and purposeful data-exchange standards will be most successful in the federal government. Often this equates to open source tools, which are naturally built for easy expansion and integration with other tools.
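As a sketch of what purposeful data exchange can look like in practice, consider the content and its record labels traveling together as plain JSON over HTTP, so any tool in the suite can consume them. The endpoint and field names below are hypothetical, not any particular vendor's API.

    # Sketch: content and labels exchanged together over an open interface.
    import json
    import urllib.request

    def publish_record(base_url: str, content: str, labels: dict) -> int:
        payload = json.dumps({"content": content, "labels": labels}).encode()
        req = urllib.request.Request(
            f"{base_url}/records",           # hypothetical endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return resp.status               # e.g., 201 Created

    # publish_record("https://agency.example.gov/api", "memo text ...",
    #                {"classification": "UNCLASSIFIED", "disposition": "PERMANENT"})

When every tool speaks a plain, documented format like this, adding a special-purpose tool to the suite is an integration task, not a migration project.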

Point 2:  Manage and Secure Through the Workflow
Very little happens in the federal government without being attached to a workflow.

  • Employee time is a workflow that leads to paychecks.
  • Purchasing small and large goods is a workflow that leads to vendor payments and receipt of goods.
  • Asset management is a workflow from asset need to asset receipt to asset long-term disposition.
  • Analytic products are a workflow from inception to review to edit to publish.
  • Meetings are a workflow from establishment to agenda to minutes to action capture and tracking.
  • Federal budget creation is an uber-workflow spanning planning, programming, budgeting, and execution.
  • Grants management is a workflow from idea submission to review to approval to tracking progress.
  • Citizen services contain many workflows for social security payments, passport processing, visa approvals, small business loans, and so forth.

Introducing solid records management to these macro and micro workflow environments is necessary and important.

The federal government needs tools that understand these intricate workflow processes and seamlessly capture the changes, approvals, and actions throughout the entire process, from creation to retirement. A suite of tools, built on open platforms for easy data exchange, is likely to be required for any federal agency. Working through big ERP systems and through small purpose-built systems, workflow foundations can capture the information necessary for approvals and for long-term retention.
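Here is a minimal sketch, with invented states and names, of what capturing a workflow end to end can mean: every transition (who, what, when) is appended to an audit trail the moment it happens, so the record of approvals exists before anyone asks for it.

    # Sketch: a workflow whose every transition becomes a record.
    from datetime import datetime, timezone

    TRANSITIONS = {
        "draft": {"review"},
        "review": {"edit", "approved"},
        "edit": {"review"},
        "approved": {"published"},
        "published": {"retired"},
    }

    class WorkflowItem:
        def __init__(self, title: str):
            self.title = title
            self.state = "draft"
            self.audit_trail = []  # the permanent record of the workflow

        def advance(self, new_state: str, actor: str) -> None:
            if new_state not in TRANSITIONS.get(self.state, set()):
                raise ValueError(f"cannot move from {self.state} to {new_state}")
            self.audit_trail.append({
                "from": self.state, "to": new_state, "actor": actor,
                "at": datetime.now(timezone.utc).isoformat(),
            })
            self.state = new_state

    item = WorkflowItem("FY15 budget analysis")
    item.advance("review", "analyst.a")
    item.advance("approved", "branch.chief")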

Equally necessary are workflow tools that maintain data integrity, individual privacy, and agency security. The federal government demands absolute security in processing workflows, especially for citizen-based services that span public and private information processing environments. It's simply not enough to have workflow tools that are fundamentally secure in a private environment. Federal agencies need confidence when exchanging data from a mobile, citizen platform to a private, agency platform.
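One building block for that confidence is message authentication. The sketch below, using Python's standard hmac library, shows the shape of the idea: the citizen-facing platform signs what it sends, and the agency platform verifies the signature before trusting the content. The key handling is deliberately simplified for illustration.

    # Sketch: integrity check on data crossing a public/private boundary.
    import hashlib
    import hmac

    SHARED_KEY = b"provisioned-out-of-band"  # illustrative only; never hard-code keys

    def sign(payload: bytes) -> str:
        return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

    def verify(payload: bytes, signature: str) -> bool:
        return hmac.compare_digest(sign(payload), signature)

    message = b'{"ssn_last4": "1234", "service": "passport-renewal"}'
    tag = sign(message)            # computed on the citizen platform
    assert verify(message, tag)    # checked on the agency platform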

Point 3:  Archive Responsibly
Fundamental to our form of government is trust. Trust of our people is fundamental. Trust by our federal workforce is fundamental. Trust in our records and information is equally fundamental. When the Administration or the Hill or the People want to know what we knew and when we knew it, federal agencies need to be at the ready to provide the truth, with facts and records to support the facts.

The federal government and its agencies aren't private institutions. Although there is information we should not keep, federal agencies should continue to err on the side of caution and keep anything that seems worth keeping. We should be prepared to keep more information and more records than legally required, to support credibility and understanding of historical decisions and outcomes.

Again, we need tools and technologies that make responsible records management and archiving easier for everyone. The amount of resources the federal government spends on review and redaction of federal records is staggering. If technology could cut those resources by just 10 percent, that would be awesome. Reaching 20 or 30 percent cost reductions would be phenomenal.

The key to reducing manpower in archiving, review, and release is solid capture at the start. At the risk of creating a circular reference, I'll take you back to my initial point: capture on creation.
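To close that loop, here is a sketch, with invented schedules and retention periods, of how labels applied at creation (Point 1) pay off at archive time: disposition can be computed mechanically instead of through manual review.

    # Sketch: disposition decisions driven by capture-time labels.
    from datetime import date, timedelta

    RETENTION = {
        "TEMPORARY-3YR": timedelta(days=3 * 365),
        "TEMPORARY-7YR": timedelta(days=7 * 365),
        "PERMANENT": None,  # transfer to the archives, never destroy
    }

    def disposition(created: date, schedule: str, today: date) -> str:
        period = RETENTION[schedule]
        if period is None:
            return "transfer-to-archives"
        if today >= created + period:
            return "eligible-for-destruction-review"
        return "retain"

    print(disposition(date(2007, 1, 29), "TEMPORARY-7YR", date(2014, 1, 29)))
    # -> eligible-for-destruction-review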

Summary

  • Federal agencies create more data and content than any of us cares to understand.
  • It's not all useful data and finding our way through the mountains of data to know and keep what's important is a tough job.
  • Securing the data to prevent harmful use and unlawful disclosure needs to be easier for federal agencies.
  • Knowing when a leak is harmful also needs to be easier for federal agencies.
  • Responding to appropriate releases of information, whether through Freedom of Information Act requests or congressional inquiries, shouldn't be as hard as it is today.
  • Guaranteeing the safety and security of private citizen data isn't a desire...it's a demand.
  • The basic needs for federal agencies are:
    • Suites of tools that do a large amount of the content management;
    • Open interfaces and open source tools that allow affordable and extensible add-ons for special purposes;
    • Tools that facilitate reduced complexity for end users and IT departments; and
    • Tools that make a records management professional's and an end user's job easier on a day-to-day basis.

About the Author

Jill Tummler Singer is CIO for the National Reconnaissance Office (NRO), which, as part of the 16-member Intelligence Community, plays a primary role in achieving information superiority for the U.S. Government and Armed Forces. A DoD agency, the NRO is staffed by DoD and CIA personnel. It is funded through the National Reconnaissance Program, part of the National Foreign Intelligence Program.

Prior to joining the NRO, Singer was Deputy CIO at the Central Intelligence Agency (CIA), where she was responsible for ensuring CIA had the information, technology, and infrastructure necessary to effectively execute its missions. Prior to her appointment as Deputy CIO, she served as the Director of the Diplomatic Telecommunications Service (DTS), United States Department of State, and was responsible for global network services to US foreign missions.

Singer has served in several senior leadership positions within the Federal Government. She was the head of Systems Engineering, Architecture, and Planning for CIA's global infrastructure organization. She served as the Director of Architecture and Implementation for the Intelligence Community CIO and pioneered the technology and management concepts that are the basis for multi-agency secure collaboration. She also served within CIA’s Directorate of Science and Technology.
