Big Data Journal: Article

Trends in Federal Records Management

Three Principles for Successful Federal Records Management

The following is a summary of my comments provided on Wednesday, January 29, 2014, at the Alfresco Content.Gov event in Washington, DC.

In my 27 years of federal service, I've watched the growth in federal records and the implementation of new executive orders and regulations aimed at improving records management across the federal space. There are immense challenges associated with litigation, review and release, tracing factual evidence for analysis, managing information for legal proceedings, and overseeing a plethora of authorized and unauthorized disclosures of classified and/or sensitive information.

Federal records management professionals are true, unsung heroes in helping our nation protect information while also protecting the civil liberties and privacy of our nation's citizens. The job has become increasingly difficult in today's era of "big data." Records management and information management in the 1980s were hard, and that's when we thought big data was hundreds of gigabytes. As we consider today's generation of data, three decades later, federal records professionals are charged with managing petabytes, and soon zettabytes, of data. It's an especially daunting task.

Three principles for records management are critical to future success for the federal space:

  1. Capture on creation;
  2. Manage and secure through the workflow; and
  3. Archive responsibly.

Point 1: Capture on Creation
The federal workforce creates content every second of every day, in formal and informal ways. It's an email, a meeting invitation, an instant message, a voice call, a VTC session, a PowerPoint deck, meeting minutes, a collaborative engagement session, a memorandum, a written paper, analytic notes, and so forth.

The federal workforce stores this created content in just as many formal and informal ways.  It's stored on local hard drives, mobile phones, corporate storage, shadow IT storage, public clouds, and private clouds.

In short...it's a mess for the records management professional.

What is needed are solid systems and capabilities that enforce capture at content creation. Simple, non-intrusive ways to prompt the creator to label information will help tremendously. Non-intrusive doesn't mean voluntary; labeling at content creation must be mandatory. Not everything is a record, but many things deserve to be preserved for after-action review, lessons learned, and knowledge-management training over time.
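As a sketch of "capture on creation," a content-creation API can simply refuse to save anything that arrives without the required record labels. The label names below are hypothetical, not any agency's actual taxonomy:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical label set; a real agency would substitute its own records taxonomy.
REQUIRED_LABELS = {"originator", "classification", "records_category"}

@dataclass
class Document:
    content: str
    labels: dict
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def create_document(content: str, labels: dict) -> Document:
    """Refuse to create content unless every required record label is present."""
    missing = REQUIRED_LABELS - labels.keys()
    if missing:
        raise ValueError(f"missing required labels: {sorted(missing)}")
    return Document(content, labels)
```

The point of the sketch is that labeling is structural, not voluntary: the system will not produce an unlabeled artifact at all.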

Many of today's technologies make it far too easy to create content and far too difficult to manage it in perpetuity.  Content creation with longevity in mind is critical for the federal records management professional and for the federal government in general.

Implementing technologies that work together to achieve the longevity goal is paramount. No federal agency can survive on one tool; one tool rarely meets the variety of end-user needs or requirements. Discovering and implementing technologies with easy interfaces, open APIs, and purposeful data-exchange foundations will be most successful in the federal government. Often this equates to open source tools, which are naturally built for easy expansion and integration with other tools.
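To illustrate the data-exchange point, the sketch below exports a record as a tool-neutral JSON envelope that any system with an open API could consume. The field names and version scheme are assumptions for illustration, not a published standard:

```python
import json

def export_record(record_id: str, labels: dict, body: str) -> str:
    """Serialize a record into a tool-neutral JSON envelope for exchange."""
    envelope = {
        "id": record_id,
        "labels": labels,
        "body": body,
        "format_version": 1,  # lets future tools detect older envelopes
    }
    return json.dumps(envelope, sort_keys=True)

def import_record(payload: str) -> dict:
    """Parse an exchanged envelope back into a dictionary."""
    return json.loads(payload)
```

A plain-text, self-describing format like this is what lets a suite of tools, rather than one monolith, share the same records.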

Point 2:  Manage and Secure Through the Workflow
Very little happens in the federal government without being attached to a workflow.

  • Employee time is a workflow that leads to paychecks.
  • Purchasing small and large goods is a workflow that leads to vendor payments and receipt of goods.
  • Asset management is a workflow from asset need to asset receipt to asset long-term disposition.
  • Analytic products are a workflow from inception to review to edit to publish.
  • Meetings are a workflow from establishment to agenda to minutes to action capture and tracking.
  • Federal budget creation is an uber-workflow spanning planning, programming, budgeting, and execution.
  • Grants management is a workflow from idea submission to review to approval to tracking progress.
  • Citizen services contain many workflows for social security payments, passport processing, visa approvals, small business loans, and so forth.

Introducing solid records management to these macro and micro workflow environments is necessary and important.

The federal government needs tools that understand intricate workflow processes and seamlessly capture the changes, approvals, and actions throughout the entire process, from creation to retirement. A suite of tools, built on open platforms for easy data exchange, is likely to be required for any federal agency. Working through big ERP systems and through small purpose-built systems, workflow foundations can capture the information necessary for approvals and for long-term retention.
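A minimal sketch of that capture, assuming a simple publish workflow (the states and transitions are illustrative, not any agency's actual process): every change is appended to an audit log that travels with the item from creation to retirement.

```python
from datetime import datetime, timezone

class Workflow:
    """Toy workflow that records every transition for long-term retention."""

    # Illustrative state machine; a real system would model the agency's process.
    TRANSITIONS = {
        "created": {"in_review"},
        "in_review": {"approved", "created"},  # reviewers can send work back
        "approved": {"published"},
        "published": {"retired"},
    }

    def __init__(self, item_id: str):
        self.item_id = item_id
        self.state = "created"
        self.audit_log = [("created", "system",
                           datetime.now(timezone.utc).isoformat())]

    def transition(self, new_state: str, actor: str) -> None:
        """Move to a legal next state, recording who acted and when."""
        if new_state not in self.TRANSITIONS.get(self.state, set()):
            raise ValueError(f"illegal transition: {self.state} -> {new_state}")
        self.audit_log.append((new_state, actor,
                               datetime.now(timezone.utc).isoformat()))
        self.state = new_state
```

Because the log is part of the item itself, the record of "who approved what, and when" survives even after the item leaves active use.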

Equally necessary are workflow tools that maintain data integrity, individual privacy, and agency security. The Federal Government demands absolute security in processing workflows, especially for citizen-based services that span public and private information processing environments. It's simply not enough to have workflow tools that are fundamentally secure in a private environment. Federal agencies need confidence when exchanging data from a mobile, citizen platform to a private, agency platform.
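One small building block for that confidence is integrity checking on every exchange. The sketch below signs a payload with HMAC-SHA256 so the receiving environment can detect tampering in transit; the shared-key arrangement is an assumption, and a real deployment would use agency-approved key management.

```python
import hashlib
import hmac
import json

def sign(payload: dict, key: bytes) -> str:
    """Compute an HMAC-SHA256 signature over a canonical JSON form."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(key, body, hashlib.sha256).hexdigest()

def verify(payload: dict, signature: str, key: bytes) -> bool:
    """Constant-time check that the payload was not altered in transit."""
    return hmac.compare_digest(sign(payload, key), signature)
```

Sorting the JSON keys gives both sides the same canonical bytes to sign, and `compare_digest` avoids leaking information through timing differences.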

Point 3:  Archive Responsibly
Fundamental to our form of government is trust.  Trust of our people is fundamental.  Trust by our federal workforce is fundamental. Trust in our records and information is equally fundamental. When the Administration or the Hill or the People want to know what we knew and when we knew it, federal agencies need to be at the ready to provide the truth - with facts and records to support the facts.

The Federal Government and its agencies aren't private institutions. Although there is some information we should not keep, federal agencies should continue to err on the side of caution and preserve anything that seems worth keeping. We should be prepared to keep more information and more records than legally required, to lend credibility and understanding to historical decisions and outcomes.

Again, we need tools and technologies that make responsible records management and archiving easier for everyone. The amount of resources the federal government spends on review and redaction of federal records is staggering. If technologies could cut those resources by just 10 percent, that would be awesome. Reaching 20 or 30 percent cost reductions would be phenomenal.
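Even a crude automated first pass can shrink the manual review load before a human ever sees a document. The patterns below are hypothetical examples for illustration only; a production system would use vetted detectors for PII and classification markings.

```python
import re

# Hypothetical patterns for illustration only; real systems need vetted detectors.
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace each pattern match with a bracketed placeholder."""
    for name, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED {name}]", text)
    return text
```

Machine-suggested redactions still go to a human reviewer; the savings come from reviewers confirming flags rather than reading every page cold.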

Key to reducing the manpower spent on archival, review, and release is solid capture at the start. At the risk of creating a circular reference, I'll take you back to my first point: capture on creation.

Summary

  • Federal agencies create more data and content than any of us cares to understand.
  • It's not all useful data and finding our way through the mountains of data to know and keep what's important is a tough job.
  • Securing the data to prevent harmful use and unlawful disclosure needs to be easier for federal agencies.
  • Knowing when a leak is harmful also needs to be easier for federal agencies.
  • Responding to appropriate releases of information, whether through Freedom of Information Act requests or congressional inquiries, shouldn't be as hard as it is today.
  • Guaranteeing the safety and security of private citizen data isn't a desire...it's a demand.
  • The basic needs for federal agencies are:
    • Suites of tools that do a large amount of the content management;
    • Open interfaces and open source tools that allow affordable and extensible add-ons for special purposes;
    • Tools that facilitate reduced complexity for end users and IT departments; and
    • Tools that make a records management professional and an end user's job easier on a day-to-day basis.

More Stories By Jill Tummler Singer

Jill Tummler Singer is CIO for the National Reconnaissance Office (NRO), which, as part of the 16-member Intelligence Community, plays a primary role in achieving information superiority for the U.S. Government and Armed Forces. A DoD agency, the NRO is staffed by DoD and CIA personnel. It is funded through the National Reconnaissance Program, part of the National Foreign Intelligence Program.

Prior to joining the NRO, Singer was Deputy CIO at the Central Intelligence Agency (CIA), where she was responsible for ensuring CIA had the information, technology, and infrastructure necessary to effectively execute its missions. Prior to her appointment as Deputy CIO, she served as the Director of the Diplomatic Telecommunications Service (DTS), United States Department of State, and was responsible for global network services to US foreign missions.

Singer has served in several senior leadership positions within the Federal Government. She was the head of Systems Engineering, Architecture, and Planning for CIA's global infrastructure organization. She served as the Director of Architecture and Implementation for the Intelligence Community CIO and pioneered the technology and management concepts that are the basis for multi-agency secure collaboration. She also served within CIA’s Directorate of Science and Technology.
