New year, new Congress: What’s next for DATA?

[This post by Hudson Hollister is cross-posted on the Data Transparency Coalition's blog.]

 

Tomorrow a new U.S. Congress – the 113th, elected last November – will be sworn in. The 112th Congress finished its run with a divisive confrontation on fiscal priorities.

Advocates of opening up government data are rightfully disappointed at the lack of Congressional action on the Digital Accountability and Transparency Act, or DATA Act. The DATA Act, championed by Reps. Darrell Issa (R-CA) and Elijah Cummings (D-MD) in the House of Representatives and by Sens. Mark Warner (D-VA) and Rob Portman (R-OH) in the Senate, would have required the U.S. government to publish all executive-branch spending data on a single Internet portal. Spending data is currently reported to at least seven separate compilations maintained by different agencies, some publicly accessible and others not.

The DATA Act also would have set up consistent data identifiers and markup languages for all spending data, which would have allowed the separate compilations to be searched together, dramatically enhancing government accountability.
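
To make the idea concrete, here is a minimal sketch in Python of why consistent identifiers matter. The record layouts, field names, and figures are hypothetical (neither the bill nor this post prescribes a format); the point is that once two compilations share an award identifier, they can be joined and searched as if they were one dataset.

    # Two separate "compilations" of spending data, as hypothetical records.
    # Without a common identifier, entries for the same award cannot be matched.
    contracts = [
        {"award_id": "GS-00F-0001", "agency": "GSA", "obligated": 1_200_000},
        {"award_id": "HHS-14-0042", "agency": "HHS", "obligated": 850_000},
    ]
    payments = [
        {"award_id": "GS-00F-0001", "paid": 400_000},
        {"award_id": "HHS-14-0042", "paid": 850_000},
    ]

    # With a consistent award_id across both compilations, a single join
    # answers questions neither dataset can answer alone.
    paid_by_award = {p["award_id"]: p["paid"] for p in payments}

    for c in contracts:
        outstanding = c["obligated"] - paid_by_award.get(c["award_id"], 0)
        print(f'{c["award_id"]} ({c["agency"]}): ${outstanding:,} unpaid obligation')

Multiply this by hundreds of agencies and millions of transactions, and the same principle is what would let auditors and the public search all of the government's separate spending compilations at once.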

The DATA Act would generate new opportunities for the tech industry, as our Coalition demonstrated at its DATA Demo Day last summer. If U.S. federal spending data were consistently published and standardized, Big Data analytics companies could develop new ways to find waste and fraud; financial management vendors could offer new services to their existing federal clients; and infomediaries like Google could republish and repackage the data for the public and profit from advertising revenues.

What happened?

The DATA Act was first introduced in both chambers in June 2011, and shortly thereafter was approved by the House Committee on Oversight and Government Reform and sent to the full House of Representatives. After further revisions the bill passed the full House in April 2012. In September 2012, Sens. Warner and Portman rewrote and reintroduced the bill in the Senate to address objections from grantees and contractors. But the next step – consideration by the Senate Homeland Security and Governmental Affairs Committee (HSGAC) and the committee’s recommendation to the full Senate – never occurred, because the committee did not hold a business meeting between the DATA Act’s reintroduction and the end of the Congress. The HSGAC’s leaders – chairman Joseph Lieberman (I-CT) and ranking member Susan Collins (R-ME) – didn’t find the DATA Act important enough to merit the time and effort of a committee meeting, deliberation, and a vote.

With the end of the 112th Congress and the start of the 113th, all pending legislative proposals are wiped out and must be re-introduced.

Is the DATA Act dead? Absolutely not! Chances are good that the bill will be introduced again and passed – by both chambers, this time! – in 2013.

Here’s why.

  • The DATA Act’s champions aren’t going away. Rep. Issa, the DATA Act’s original author, and Sen. Warner, its first Senate sponsor, remain well-placed to champion a new version of the bill, with Issa keeping his position as chairman of the powerful House Committee on Oversight and Government Reform. Meanwhile, term limits and retirement will bring new leadership to the Senate HSGAC: chairman Tom Carper (D-DE) and ranking member Tom Coburn (R-OK). Carper and Coburn may well prove more enthusiastic about pursuing data transparency in federal spending than Lieberman and Collins were.
  • The Recovery Board’s spending accountability portal IS going away. The original DATA Act was based on lessons learned from the work of the Recovery Accountability and Transparency Board, the temporary federal agency established by Congress in 2009 to oversee the spending mandated by the economic stimulus law. The Recovery Board used standardized electronic identifiers and markup languages to make its online accountability portal fully accurate, complete, and searchable. According to the Recovery Board’s most recent report, inspectors general using the portal recovered $40 million in stimulus funds from questionable contractors and grantees and prevented $30 million from being paid out in the first place. The Recovery Board’s portal could easily be expanded to cover all spending rather than just stimulus spending. (No reliable government-wide spending accountability portal exists.) But the Recovery Board’s legal mandate only covers overseeing the stimulus. The temporary agency will be eliminated when the stimulus law expires on September 30, 2013. Unless Congress acts, the Recovery Board’s portal will be decommissioned on that date – and replaced with nothing. The Recovery Board’s imminent demise should put pressure on Congress to pass legislation to make sure that the time, money, and effort spent to create the stimulus accountability portal are not wasted. The April 2012 House version of the DATA Act – and amendments our Coalition suggested for the September 2012 Senate version – would do exactly that.
  • The press is taking notice. Last month, Washington Post columnist Dana Milbank asked why the DATA Act – “so uncontroversial that it passed the House on a voice vote” – has yet to achieve full passage. Milbank’s reaction is that of any smart layperson: the need to publish all federal spending data online, and make it fully searchable through electronic standardization, sounds like something the executive branch should be doing already – and if it isn’t, Congress should tell it to.

