Well-Engineered Use of AWS by the Recovery Accountability and Transparency Board (RATB)

By Bob Gourley

After speaking with Shawn Kingsberry in preparation for our 4 April Government Big Data Forum, I realized that the RATB's use of Amazon Web Services (AWS) may be of great interest to our readers, so I went looking for what information was publicly available online. I was pleased to find a well-written use case covering much of it on the AWS website. That write-up includes a helpful graphic showing how things were done.

Since this write-up is provided by AWS to articulate its own contributions, it does not cover the many other services and components required to make the system work. But many of those components are likely modular and interchangeable with other capabilities, so this overview is still a great way to get a baseline on the RATB architecture.

With that, the following is from: http://aws.amazon.com/solutions/case-studies/ratb/

AWS Case Study: Recovery.gov and AWS Bring Transparency to the Cloud

The Recovery Accountability and Transparency Board (RATB) was established when Congress passed the American Recovery and Reinvestment Act (ARRA) in February 2009. To guard against waste, fraud, and abuse, the RATB was tasked with developing a Website that met the following goals:

  • Provide easily accessible information to the public on Recovery spending and results
  • Promote official data in public debate
  • Provide fair and open access to Recovery opportunities
  • Enable public accountability for Recovery spending
  • Promote an understanding of the local impact of Recovery spending

The resulting Website is Recovery.gov.

The RATB originally intended to use Amazon Web Services (AWS) only for development, testing, and failover, but, says Jim Warren, RATB Chief Information Officer, “When AWS outperformed our on-premises solution at a fraction of the cost, the prime contractor Smartronix and its lead sub-contractor Synteractive provided a compelling justification for the RATB to host Recovery.gov on AWS’s platform.”

According to Mr. Warren, Smartronix selected AWS because of the flexibility provided by AWS’s Infrastructure as a Service (IaaS) model; its track record of providing infrastructure for large-scale commercial projects; its focus on cost-effectiveness and a pay-as-you-go model that allowed Smartronix to control costs; its commitment to security and reliability; and its FISMA Low certification.

The RATB now uses the following AWS services: Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), Elastic Load Balancing (ELB), and Amazon CloudWatch. The solution also combines multiple pieces of software.
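
The case study names the services but not how they fit together. As a rough illustration only (it is not part of the AWS write-up), the sketch below shows how such a stack might be provisioned with Python and boto3; the region, AMI ID, instance type, and every resource name are hypothetical assumptions, not details of the RATB deployment.

import boto3

REGION = "us-east-1"  # assumption; the case study does not name a region

ec2 = boto3.client("ec2", region_name=REGION)
elb = boto3.client("elb", region_name=REGION)  # Classic Load Balancer API
s3 = boto3.client("s3", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)

# Web tier: two EC2 instances (the AMI ID is a placeholder).
instances = ec2.run_instances(
    ImageId="ami-00000000",  # placeholder, not a real image
    InstanceType="m5.large",
    MinCount=2,
    MaxCount=2,
)["Instances"]
instance_ids = [i["InstanceId"] for i in instances]

# Durable object storage: an S3 bucket for static content and backups.
s3.create_bucket(Bucket="example-recovery-static-content")

# Block storage: an EBS volume for a database or file-system tier.
ec2.create_volume(AvailabilityZone=f"{REGION}a", Size=100, VolumeType="gp3")

# Load balancing: put the web tier behind an ELB.
elb.create_load_balancer(
    LoadBalancerName="example-web-elb",
    Listeners=[{
        "Protocol": "HTTP", "LoadBalancerPort": 80,
        "InstanceProtocol": "HTTP", "InstancePort": 80,
    }],
    AvailabilityZones=[f"{REGION}a", f"{REGION}b"],
)
elb.register_instances_with_load_balancer(
    LoadBalancerName="example-web-elb",
    Instances=[{"InstanceId": iid} for iid in instance_ids],
)

# Monitoring: a CloudWatch alarm on web-tier CPU utilization.
cloudwatch.put_metric_alarm(
    AlarmName="example-web-cpu-high",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=75.0,
    ComparisonOperator="GreaterThanThreshold",
    Dimensions=[{"Name": "InstanceId", "Value": instance_ids[0]}],
)

The real deployment would layer networking, security hardening, and the software stack described below on top of this skeleton; the sketch only shows how the five named services relate.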

The following diagram illustrates their topology:

[Figure: RATB AWS architecture diagram, Recovery Accountability and Transparency Board]

Business Intelligence and Data Warehousing
The website uses Microsoft's SharePoint as its content management system, and all data is aggregated into a global dimensional data warehouse to facilitate time-based analysis and reporting. The solution leverages SAP BusinessObjects and Microsoft SQL Server for reporting services that show how and where the money is being spent. The BI tools enable ad hoc reporting and are instrumental in data quality and data integrity scorecarding.
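
The case study does not publish the warehouse schema, so as a minimal sketch of what time-based reporting against a dimensional model can look like, the query below assumes a hypothetical star schema (fact_award, dim_date, dim_recipient; all table and column names invented) on the Microsoft SQL Server instance the case study mentions, accessed from Python via pyodbc.

import pyodbc

# Quarterly spending by state from a star schema: one row per award
# payment in the fact table, sliced by the date and recipient dimensions.
QUARTERLY_SPEND = """
SELECT d.fiscal_year,
       d.fiscal_quarter,
       r.state_code,
       SUM(f.award_amount) AS total_awarded
FROM fact_award AS f
JOIN dim_date AS d      ON f.date_key = d.date_key
JOIN dim_recipient AS r ON f.recipient_key = r.recipient_key
GROUP BY d.fiscal_year, d.fiscal_quarter, r.state_code
ORDER BY d.fiscal_year, d.fiscal_quarter, r.state_code;
"""

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-warehouse;DATABASE=recovery_dw;"  # placeholder connection
    "Trusted_Connection=yes;"
)
for year, quarter, state, total in conn.execute(QUARTERLY_SPEND):
    print(f"FY{year} Q{quarter} {state}: ${total:,.2f}")

This is what makes a dimensional warehouse suited to the reporting described: the same facts can be rolled up by quarter, state, or recipient without restructuring the data.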

Advanced Geospatial Analysis and Mapping
The geospatial tools, based on ESRI software, support up to 5,000 concurrent users and enable them to go directly to their communities of interest at the state, ZIP code, congressional district, or county level. Hundreds of thousands of addresses are geocoded and aggregated to display total value for each area of interest. Thematic maps and multiple view selections were incorporated to help the user better visualize the data; these thematic maps include funding heat maps, unemployment heat maps, and diversity maps.
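
The geocoding pipeline itself is not described, but the aggregation step it feeds is straightforward. Below is a minimal sketch, assuming a hypothetical record layout rather than the RATB's actual data model, of rolling geocoded award records up to totals per area of interest.

from collections import defaultdict

# Each record carries the geocoded area keys plus the award amount.
# The field names and figures are illustrative, not real Recovery data.
awards = [
    {"state": "VA", "zip": "22314", "district": "VA-08", "amount": 1_200_000.0},
    {"state": "VA", "zip": "22314", "district": "VA-08", "amount": 350_000.0},
    {"state": "MD", "zip": "20850", "district": "MD-08", "amount": 900_000.0},
]

def totals_by(area_key, records):
    """Aggregate award amounts by one area of interest (state, zip, ...)."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec[area_key]] += rec["amount"]
    return dict(totals)

print(totals_by("state", awards))     # {'VA': 1550000.0, 'MD': 900000.0}
print(totals_by("district", awards))  # {'VA-08': 1550000.0, 'MD-08': 900000.0}

Each thematic map then simply colors areas by totals like these (or by unemployment or diversity figures) instead of re-querying hundreds of thousands of individual addresses.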

Mr. Warren notes that testing and development enclaves were procured and ready on Amazon EC2 within two days of contract award. He says, “Our migration to the cloud took only 22 days from feasibility study to production.” The RATB has also enjoyed improved computer security, including greater protection against network attacks and real-time detection of system tampering. Mr. Warren says, “In essence, the security system of AWS’s platform has been added to our existing security systems. We now have a security posture consistent with that of a multi-billion dollar company.” Additional benefits include lower costs and the ability to add capacity on demand. The RATB expects to save around $750K during its current budget cycle.

The success of Recovery.gov is being noticed outside the RATB as well: Andrew Romano of Newsweek wrote, “The current incarnation of Recovery.gov…is perhaps the clearest, richest interactive database ever produced by the American bureaucracy.” The site has been given the 2009 Merit award, the 2010 Gold Addy award for Website design, the InformationWeek Government IT Innovator 2010 Award, an Award of Distinction at the 16th Annual Communicator Awards, and a second-place Gold Screen Award from the National Association of Government Communicators. Recovery.gov is also an official Honoree in the Financial Services category of the 14th Annual Webby Awards.

To learn more, see http://recovery.gov



More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity, and emerging technology firms. He has extensive industry experience in intelligence and security. He was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
