Public Sector Big Data: Five Ways Big Data Must Evolve in 2013

2012 will go down as a “Big” year for Big Data in the public sector

By Ray Muslimani

Editor’s note: This guest post provides context on mission-focused data analytics in the federal space from Ray Muslimani, one of the leaders of the federal Big Data movement. -bg

2012 will go down as a “Big” year for Big Data in the public sector. Rhetoric and hype have been followed by tangible action on the part of both government and industry. The $200 million Big Data initiative unveiled by the White House in March 2012 injected R&D funding and credibility into efforts to develop tools and technologies to help solve the nation’s most pressing challenges.

On the industry side, the recently issued TechAmerica report, “Demystifying Big Data,” provides agencies with a roadmap for using Big Data to better serve citizens. It also offers a set of policy recommendations and practical steps agencies can take to get started with Big Data initiatives.

For all of the enthusiasm around Big Data this year, every indication is that 2013 will be the year when Big Data transforms the business of government. Below are five steps that must be taken for Big Data to evolve in 2013 and deliver on its promise.

Demystify Big Data
Government agencies warmed to the potential of Big Data throughout 2012, but more education is required to help decision makers wade through their options and justify further investments. Removing the ambiguities surrounding Big Data requires an emphasis in 2013 on education from both industry and government.

The TechAmerica Big Data report is a good example of how industry can play an active role in guiding agencies through Big Data initiatives. It also underscores that vendors can’t generate more Big Data RFPs through marketing slicks and sales tactics alone. That approach will not demystify Big Data; providers of Big Data tools and solutions who focus only on poking holes in competitors’ alternatives will simply seed further doubt.

Industry and government should follow proven templates for education in 2013. For example, agencies can arrange “Big Data Days” modeled on today’s Industry Tech Days. Big Data industry days can help IT providers gain better insight into how each agency plans to approach its Big Data challenges in 2013, and offer these agencies an opportunity to see a wide range of Big Data services.

The Big Data education process must also extend to contracting officers. Agencies need guidance on how RFPs can be constructed to address a service-based model.

Consumerize Big Data
While those within the public sector with the proper training and skills to analyze data have benefited from advanced Big Data tools, it has been far more difficult for everyday business users and decision makers to access the data in a useful way. Sluggish data query responses, data quality issues, and a clunky user experience are undermining the benefits Big Data analytics can deliver, and require users to be de facto “data scientists” to make sense of it all.

Underscoring this challenge is a 2012 MeriTalk survey, “The Big Data Gap,” which finds that just 60 percent of IT professionals indicate their agency is analyzing the data it collects, and a modest 40 percent are using data to make strategic decisions. All of this despite the fact that 96 percent of those surveyed expect their agency’s stored data to grow in the next two years, by an average of 64 percent. The gap suggests that those who are not “data scientists” struggle to convert data into business decisions.

What if any government user could ask a question in natural language and receive the answer as a relevant visualization? For Big Data to evolve in 2013, we must consumerize the user experience: replace spreadsheets and reports with visualizations, and place the power of analytics in the hands of users at any level, regardless of analytics expertise.
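To make the idea concrete, here is a deliberately minimal sketch of the “ask in natural language, get a visualization” flow. Everything in it is hypothetical: the agency records, the keyword-based question matcher, and the text bar chart all stand in for the NLP parsing and BI rendering engines a real product would use.

```python
# Hypothetical sample data standing in for an agency dataset.
records = [
    {"agency": "DOT", "year": 2012, "requests": 1200},
    {"agency": "DOT", "year": 2011, "requests": 950},
    {"agency": "EPA", "year": 2012, "requests": 800},
    {"agency": "EPA", "year": 2011, "requests": 700},
]

def answer(question: str) -> dict:
    """Map a plain-English question to a simple aggregation.

    A real system would parse the question with NLP; this sketch only
    looks for a year mentioned in the text.
    """
    year = 2012 if "2012" in question else 2011
    totals = {}
    for r in records:
        if r["year"] == year:
            totals[r["agency"]] = totals.get(r["agency"], 0) + r["requests"]
    return totals

def render(totals: dict) -> str:
    """Render the answer as a crude text bar chart, one bar per agency."""
    width = max(totals.values())
    lines = []
    for agency, n in sorted(totals.items()):
        bar = "#" * (20 * n // width)
        lines.append(f"{agency:>4} | {bar} {n}")
    return "\n".join(lines)

print(render(answer("How many requests did each agency handle in 2012?")))
```

The point of the sketch is the shape of the pipeline, question in, aggregation in the middle, visualization out, not the toy matching logic, which any production system would replace with real language understanding.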

Mobilize Big Data
IDC Government Insights predicts that in 2013, 35 percent of new Federal and state applications will be mobile. At the same time, 65 percent of Federal IT executives expect mobile device use to increase by 20 percent in 2013, according to The 2012-2013 Telework/Mobile IT Almanac.

Part of consumerizing Big Data means building it for any device so that users do not need to be tethered to their desktops to analyze data. Agency decision makers must be empowered to easily view and analyze data on tablets and smartphones, while the increase of teleworking in the public sector requires Big Data to be accessible from anywhere, at any time, and on any device.

Both established Federal IT providers and upstarts are at work on promising innovation, taking a mobile-first path to Big Data rather than the traditional approach of building BI dashboards for the desktop. The degree to which 2013 sees a shift in Big Data from the desktop to tablets and smartphones will depend on how forcefully solutions providers employ a mobile-first approach.

Act on Big Data
A tremendous amount of “thought” energy went into Big Data in 2012. For Big Data to evolve in a meaningful way in 2013, initiatives and studies must generate more action in the form of Big Data RFIs and RFPs.

In the current tight budget climate, agencies will not act on Big Data if vendor proposals require massive investments in IT infrastructure and staffing. The financial and resource burden must shift, to the extent possible, from agency to vendor. For example, some vendors have developed “Big Data Clouds” that allow agencies to leverage a secure, scalable framework for storing and managing data, along with a toolset for performing consumer-grade search and analysis on that data.

Open Big Data
Adoption of Big Data solutions has been accelerated by open source tools such as Hadoop, MapReduce, Hive, and HBase. While some agencies will find it tempting to withdraw to the comfort of proprietary Big Data tools that they can control in closed systems, that path undermines the value Big Data can ultimately deliver.

One could argue that as open source goes in 2013, Big Data goes as well. If open source platforms and tools continue to address agency demands for security, scalability, and flexibility, the benefits from Big Data within and across agencies will increase exponentially. There are hundreds of thousands of viable open source technologies on the market today. Not all are suitable for agency requirements, but as agencies update and expand their uses of data, these tools offer limitless opportunities to innovate. Additionally, opting for open source instead of proprietary vendor solutions prevents an agency from being locked into a single vendor’s tool that it may at some point outgrow or find ill suited for its needs.
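For readers unfamiliar with the programming model behind tools like Hadoop, the following toy illustrates MapReduce in plain Python. It is a sketch only: a real Hadoop job distributes the map and reduce phases across a cluster and handles fault tolerance; this shows just the data flow.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(docs):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in docs:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce step: group the pairs by word and sum the counts.

    Sorting by key mimics the shuffle phase that a real MapReduce
    framework performs between the map and reduce stages.
    """
    for word, group in groupby(sorted(pairs, key=itemgetter(0)),
                               key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

docs = ["big data big promise", "open data open government"]
counts = dict(reduce_phase(map_phase(docs)))
print(counts)
# → {'big': 2, 'data': 2, 'government': 1, 'open': 2, 'promise': 1}
```

The same map and reduce functions, handed to a framework such as Hadoop, would scale from two sentences to billions of records, which is exactly why this simple model has driven so much Big Data adoption.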


More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity, and emerging technology firms. He has extensive industry experience in intelligence and security, was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
