
New year, new Congress: What’s next for DATA?

[This post by Hudson Hollister is cross-posted on the Data Transparency Coalition's blog.]

Tomorrow a new U.S. Congress – the 113th, elected last November – will be sworn in.  The 112th Congress finished its run with a divisive confrontation on fiscal priorities.

Advocates of opening up government data are rightfully disappointed at the lack of Congressional action on the Digital Accountability and Transparency Act, or DATA Act. The DATA Act, championed by Reps. Darrell Issa (R-CA) and Elijah Cummings (D-MD) in the House of Representatives and by Sens. Mark Warner (D-VA) and Rob Portman (R-OH) in the Senate, would have required the U.S. government to publish all executive-branch spending data on a single Internet portal. Spending data is currently reported to at least seven separate compilations maintained by different agencies, some publicly accessible and others not.

The DATA Act also would have set up consistent data identifiers and markup languages for all spending data, which would have allowed the separate compilations to be searched together, dramatically enhancing government accountability.

The DATA Act would generate new opportunities for the tech industry, as our Coalition demonstrated at its DATA Demo Day last summer. If U.S. federal spending data were consistently published and standardized, Big Data analytics companies could develop new ways to find waste and fraud; financial management vendors could offer new services to their existing federal clients; and infomediaries like Google could republish and repackage the data for the public, profiting from advertising revenue.
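To see why consistent identifiers matter, consider a minimal sketch of joining two spending compilations. The agency datasets, award IDs, recipient names, and dollar amounts below are invented for illustration only; the point is that a shared identifier makes cross-compilation searching a simple join, while without one, records must be matched fuzzily by name, which is error-prone.

```python
# Hypothetical example: two separate spending compilations that share
# a standardized award identifier. All values below are invented.

contracts = [  # e.g., a contract-obligations compilation
    {"award_id": "AWD-0001", "recipient": "Acme Corp", "obligated": 250_000},
    {"award_id": "AWD-0002", "recipient": "Widget LLC", "obligated": 90_000},
]

payments = [  # e.g., a separate payments compilation
    {"award_id": "AWD-0001", "paid": 260_000},
    {"award_id": "AWD-0002", "paid": 90_000},
]

# With a shared identifier, linking the compilations is a dictionary lookup.
paid_by_id = {p["award_id"]: p["paid"] for p in payments}

# A simple accountability check: flag awards where payments exceed obligations.
for c in contracts:
    overpaid = paid_by_id[c["award_id"]] - c["obligated"]
    if overpaid > 0:
        print(f'{c["award_id"]} ({c["recipient"]}): payments exceed obligations by ${overpaid:,}')
```

This is the kind of automated waste-and-fraud detection that becomes possible only once every compilation tags each award with the same identifier.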

What happened?

The DATA Act was first introduced in both chambers in June 2011, and shortly thereafter was approved by the House Committee on Oversight and Government Reform and sent to the full House of Representatives. After further revisions the bill passed the full House in April 2012. In September 2012, Sens. Warner and Portman rewrote and reintroduced the bill in the Senate to address objections from grantees and contractors. But the next step – consideration by the Senate Homeland Security and Governmental Affairs Committee (HSGAC) and the committee’s recommendation to the full Senate – never occurred, because the committee did not hold a business meeting between the DATA Act’s reintroduction and the end of the Congress. The HSGAC’s leaders – chairman Joseph Lieberman (I-CT) and ranking member Susan Collins (R-ME) – didn’t find the DATA Act important enough to merit the time and effort of a committee meeting, deliberation, and a vote.

With the end of the 112th Congress and the start of the 113th, all pending legislative proposals are wiped out and must be re-introduced.

Is the DATA Act dead? Absolutely not! Chances are good that the bill will be introduced again and passed – by both chambers, this time! – in 2013.

Here’s why.

  • The DATA Act’s champions aren’t going away. Rep. Issa, the DATA Act’s original author, and Sen. Warner, its first Senate sponsor, remain well-placed to champion a new version of the bill, with Issa keeping his position as chairman of the powerful House Committee on Oversight and Government Reform. Meanwhile, term limits and retirement will bring new leadership to the Senate HSGAC: chairman Tom Carper (D-DE) and ranking member Tom Coburn (R-OK). Carper and Coburn may well prove more enthusiastic about pursuing data transparency in federal spending than Lieberman and Collins were.
  • The Recovery Board’s spending accountability portal IS going away. The original DATA Act was based on lessons learned from the work of the Recovery Accountability and Transparency Board, the temporary federal agency established by Congress in 2009 to oversee the spending mandated by the economic stimulus law. The Recovery Board used standardized electronic identifiers and markup languages to make its online accountability portal fully accurate, complete, and searchable. According to the Recovery Board’s most recent report, inspectors general using the portal recovered $40 million in stimulus funds from questionable contractors and grantees and prevented $30 million from being paid out in the first place. The Recovery Board’s portal could easily be expanded to cover all spending rather than just stimulus spending. (No reliable government-wide spending accountability portal exists.) But the Recovery Board’s legal mandate only covers overseeing the stimulus. The temporary agency will be eliminated when the stimulus law expires on September 30, 2013. Unless Congress acts, the Recovery Board’s portal will be decommissioned on that date – and replaced with nothing. The Recovery Board’s imminent demise should put pressure on Congress to pass legislation to make sure that the time, money, and effort spent to create the stimulus accountability portal are not wasted. The April 2012 House version of the DATA Act – and amendments our Coalition suggested for the September 2012 Senate version – would do exactly that.
  • The press is taking notice. Last month, Washington Post columnist Dana Milbank asked why the DATA Act – “so uncontroversial that it passed the House on a voice vote” – has yet to achieve full passage. Milbank’s reaction is that of any smart layperson: the need to publish all federal spending data online, and make it fully searchable through electronic standardization, sounds like something the executive branch should be doing already – and if it isn’t, Congress should tell it to.

Read the original blog entry...

More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity, and emerging technology firms. He has extensive industry experience in intelligence and security. He was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an Infoworld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
