New year, new Congress: What’s next for DATA?

By Hudson Hollister
[This post by Hudson Hollister is cross-posted on the Data Transparency Coalition's blog.]


Tomorrow a new U.S. Congress – the 113th, elected last November – will be sworn in. The 112th Congress finished its run with a divisive confrontation on fiscal priorities.

Advocates of opening up government data are rightfully disappointed at the lack of Congressional action on the Digital Accountability and Transparency Act, or DATA Act. The DATA Act, championed by Reps. Darrell Issa (R-CA) and Elijah Cummings (D-MD) in the House of Representatives and by Sens. Mark Warner (D-VA) and Rob Portman (R-OH) in the Senate, would have required the U.S. government to publish all executive-branch spending data on a single Internet portal. Spending data is currently reported to at least seven separate compilations maintained by different agencies, some publicly accessible and others not.

The DATA Act also would have set up consistent data identifiers and markup languages for all spending data, which would have allowed the separate compilations to be searched together, dramatically enhancing government accountability.
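The value of those shared identifiers is easy to illustrate. Below is a minimal sketch in Python, using entirely hypothetical record layouts and award IDs (nothing here reflects an actual federal schema), of how a common identifier lets two separate spending compilations be joined and cross-checked as if they were one dataset.

```python
# Hypothetical records from two separate spending compilations.
# Today each system uses its own identifiers; a common award_id,
# of the kind the DATA Act envisioned, is what makes a join possible.

contracts = [
    {"award_id": "A-001", "agency": "DOT", "obligated": 1_200_000},
    {"award_id": "A-002", "agency": "HHS", "obligated": 450_000},
]

payments = [
    {"award_id": "A-001", "outlaid": 1_350_000},
    {"award_id": "A-002", "outlaid": 430_000},
]

# Index payments by the shared identifier, then join.
outlays = {p["award_id"]: p["outlaid"] for p in payments}

for contract in contracts:
    paid = outlays.get(contract["award_id"])
    # Flag awards where more was paid out than was obligated --
    # the kind of cross-compilation check standardization enables.
    if paid is not None and paid > contract["obligated"]:
        print(f"{contract['award_id']} ({contract['agency']}): "
              f"paid {paid:,} vs. obligated {contract['obligated']:,}")
```

Without the common award identifier, the join at the center of the sketch is impossible, and each compilation can be searched only in isolation.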

The DATA Act would generate new opportunities for the tech industry, as our Coalition demonstrated at its DATA Demo Day last summer. If U.S. federal spending data were consistently published and standardized, Big Data analytics companies could develop new ways to find waste and fraud; financial management vendors could offer new services to their existing federal clients; and infomediaries like Google could republish and repackage the data for the public and profit from advertising revenues.

What happened?

The DATA Act was first introduced in both chambers in June 2011, and shortly thereafter was approved by the House Committee on Oversight and Government Reform and sent to the full House of Representatives. After further revisions the bill passed the full House in April 2012. In September 2012, Sens. Warner and Portman rewrote and reintroduced the bill in the Senate to address objections from grantees and contractors. But the next step – consideration by the Senate Homeland Security and Governmental Affairs Committee (HSGAC) and the committee’s recommendation to the full Senate – never occurred, because the committee did not hold a business meeting between the DATA Act’s reintroduction and the end of the Congress. The HSGAC’s leaders – chairman Joseph Lieberman (I-CT) and ranking member Susan Collins (R-ME) – didn’t find the DATA Act important enough to merit the time and effort of a committee meeting, deliberation, and a vote.

With the end of the 112th Congress and the start of the 113th, all pending legislative proposals are wiped out and must be re-introduced.

Is the DATA Act dead? Absolutely not! Chances are good that the bill will be introduced again and passed – by both chambers, this time! – in 2013.

Here’s why.

  • The DATA Act’s champions aren’t going away. Rep. Issa, the DATA Act’s original author, and Sen. Warner, its first Senate sponsor, remain well-placed to champion a new version of the bill, with Issa keeping his position as chairman of the powerful House Committee on Oversight and Government Reform. Meanwhile, term limits and retirement will bring new leadership to the Senate HSGAC: chairman Tom Carper (D-DE) and ranking member Tom Coburn (R-OK). Carper and Coburn may well prove more enthusiastic about pursuing data transparency in federal spending than Lieberman and Collins were.
  • The Recovery Board’s spending accountability portal IS going away. The original DATA Act was based on lessons learned from the work of the Recovery Accountability and Transparency Board, the temporary federal agency established by Congress in 2009 to oversee the spending mandated by the economic stimulus law. The Recovery Board used standardized electronic identifiers and markup languages to make its online accountability portal fully accurate, complete, and searchable. According to the Recovery Board’s most recent report, inspectors general using the portal recovered $40 million in stimulus funds from questionable contractors and grantees and prevented $30 million from being paid out in the first place. The Recovery Board’s portal could easily be expanded to cover all spending rather than just stimulus spending. (No reliable government-wide spending accountability portal exists.) But the Recovery Board’s legal mandate only covers overseeing the stimulus. The temporary agency will be eliminated when the stimulus law expires on September 30, 2013. Unless Congress acts, the Recovery Board’s portal will be decommissioned on that date – and replaced with nothing. The Recovery Board’s imminent demise should put pressure on Congress to pass legislation to make sure that the time, money, and effort spent to create the stimulus accountability portal are not wasted. The April 2012 House version of the DATA Act – and amendments our Coalition suggested for the September 2012 Senate version – would do exactly that.
  • The press is taking notice. Last month, Washington Post columnist Dana Milbank asked why the DATA Act – “so uncontroversial that it passed the House on a voice vote” – has yet to achieve full passage. Milbank’s reaction is that of any smart layperson: the need to publish all federal spending data online, and make it fully searchable through electronic standardization, sounds like something the executive branch should be doing already – and if it isn’t, Congress should tell it to.
