
Well-Engineered Use of AWS by the Recovery Accountability and Transparency Board (RATB)

By Bob Gourley

After speaking with Shawn Kingsberry in preparation for our 4 April Government Big Data Forum, I realized that the board's use of Amazon Web Services (AWS) might be of very high interest to our readers, so I went looking online to see what was publicly available. I was ecstatic to find a well-written use case covering much of it on the AWS website. That write-up includes a nice graphic that is helpful for understanding how things were done.

Since this write-up is provided by AWS as a way of articulating its own contributions, it does not go into the many other services and components required to make the solution work. But many of those components are probably modular and interchangeable with other capabilities, so this overview is a great way to get a baseline understanding of the RATB architecture.

With that, the following is from: http://aws.amazon.com/solutions/case-studies/ratb/

AWS Case Study: Recovery.gov and AWS Bring Transparency to the Cloud

The Recovery Accountability and Transparency Board (RATB) was established when Congress passed the American Recovery and Reinvestment Act (ARRA) in February 2009. To guard against waste, fraud, and abuse, the RATB was tasked with developing a website that met the following goals:

  • Provide easily accessible information to the public on Recovery spending and results
  • Promote official data in public debate
  • Provide fair and open access to Recovery opportunities
  • Enable public accountability for Recovery spending
  • Promote an understanding of the local impact of Recovery spending

The resulting Website is Recovery.gov.

The RATB originally intended to use Amazon Web Services (AWS) only for development, testing, and failover, but, says Jim Warren, RATB Chief Information Officer, “When AWS outperformed our on-premises solution at a fraction of the cost, the prime contractor Smartronix and its lead sub-contractor Synteractive, provided a compelling justification for the RATB to host Recovery.gov on AWS’s platform.”

According to Mr. Warren, Smartronix selected AWS because of the flexibility provided by AWS’s Infrastructure as a Service (IaaS) model; track record of providing infrastructure for large-scale commercial projects; focus on cost-effectiveness and a pay-as-you-go-model that allowed Smartronix to control costs; commitment to security and reliability; and its FISMA Low certification.

The RATB now uses the following AWS services: Amazon Elastic Compute Cloud (Amazon EC2), Amazon Simple Storage Service (Amazon S3), Amazon Elastic Block Store (Amazon EBS), Elastic Load Balancing (ELB), and Amazon CloudWatch. The solution also combined multiple pieces of software.
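To make the combination of services concrete, the serving tier described above can be sketched in CloudFormation-style YAML. Note that the resource names, instance size, AMI, availability zone, and alarm threshold below are illustrative assumptions, not the actual RATB template:

```yaml
# Illustrative sketch only -- not RATB's real configuration.
Resources:
  WebServer:
    Type: AWS::EC2::Instance        # EC2 compute for the web tier
    Properties:
      InstanceType: m1.large        # hypothetical size
      ImageId: ami-12345678         # placeholder AMI
  WebLoadBalancer:
    Type: AWS::ElasticLoadBalancing::LoadBalancer   # ELB in front of EC2
    Properties:
      AvailabilityZones: ["us-east-1a"]
      Listeners:
        - LoadBalancerPort: "80"
          InstancePort: "80"
          Protocol: HTTP
      Instances:
        - !Ref WebServer
  StaticAssets:
    Type: AWS::S3::Bucket           # S3 for static content and backups
  HighCpuAlarm:
    Type: AWS::CloudWatch::Alarm    # CloudWatch monitoring of the fleet
    Properties:
      MetricName: CPUUtilization
      Namespace: AWS/EC2
      Statistic: Average
      Period: 300
      EvaluationPeriods: 2
      Threshold: 80                 # hypothetical alert threshold
      ComparisonOperator: GreaterThanThreshold
```

EBS volumes would attach to the EC2 instances for persistent disk; they are omitted here for brevity.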

The following diagram illustrates the topology:

[Architecture diagram: Recovery Accountability and Transparency Board topology on AWS]

Business Intelligence and Data Warehousing
The website uses Microsoft’s SharePoint as its content management system, and all data is aggregated into a global dimensional data warehouse to facilitate time-based analysis and reporting. The solution leverages SAP BusinessObjects and Microsoft SQL Server for reporting services that show how and where the money is being spent. The BI tools enable ad hoc reporting and are instrumental in Data Quality and Data Integrity score-carding.
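The time-based roll-ups such a dimensional warehouse supports amount to fact tables joined to a date dimension and grouped by period. Here is a minimal sketch using SQLite; the table names, states, and dollar amounts are invented for illustration (the real warehouse runs on Microsoft SQL Server with BusinessObjects reporting):

```python
import sqlite3

# Tiny star-schema sketch: a spending fact table joined to a date
# dimension, rolled up by quarter. All names and figures are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, quarter TEXT);
    CREATE TABLE fact_award (date_key INTEGER, state TEXT, amount REAL);
    INSERT INTO dim_date VALUES (1, '2009-Q2'), (2, '2009-Q3');
    INSERT INTO fact_award VALUES (1, 'VA', 120.0), (1, 'MD', 80.0),
                                  (2, 'VA', 50.0);
""")
rows = conn.execute("""
    SELECT d.quarter, SUM(f.amount) AS total
    FROM fact_award f JOIN dim_date d USING (date_key)
    GROUP BY d.quarter ORDER BY d.quarter
""").fetchall()
print(rows)  # [('2009-Q2', 200.0), ('2009-Q3', 50.0)]
```

The same GROUP BY pattern, extended with state, agency, or program dimensions, is what drives the "how and where the money is being spent" reports.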

Advanced Geospatial Analysis and Mapping
The geospatial tools, based on ESRI software, support up to 5,000 concurrent users and enable them to go directly to their communities of interest at the state, zip, congressional district, or county level. Hundreds of thousands of addresses are geo-coded and aggregated to display the total value for each area of interest. Thematic maps and multiple view selections were incorporated to help the user better visualize the data. These thematics include funding heat maps, unemployment heat maps, and diversity maps.
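Once addresses are geocoded, the aggregation step is a straightforward group-and-sum to the chosen area of interest. The records below are hypothetical, and real geocoding would be done by ESRI tooling, which is not shown here:

```python
from collections import defaultdict

# Hypothetical records of (zip_code, award_amount) -- in production each
# address would first be geocoded to its zip/county/district; here the
# zip codes are already assigned for illustration.
awards = [("20001", 1_500_000), ("20001", 250_000), ("22201", 900_000)]

totals = defaultdict(float)
for zip_code, amount in awards:
    totals[zip_code] += amount  # roll up to the area of interest

print(dict(totals))  # {'20001': 1750000.0, '22201': 900000.0}
```

Per-area totals like these are what the thematic heat maps shade, with the same roll-up repeated at the state, county, and congressional-district levels.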

Mr. Warren notes that testing and development enclaves were procured and ready on Amazon EC2 within two days of the contract award. He says, “Our migration to the cloud took only 22 days from feasibility study to production.” The RATB has also enjoyed improved computer security, including greater protection against network attacks and real-time detection of system tampering. Mr. Warren says, “In essence, the security system of AWS’s platform has been added to our existing security systems. We now have a security posture consistent with that of a multi-billion dollar company.” Additional benefits include lower costs and the ability to add capacity on demand. The RATB expects to save around $750K during its current budget cycle.

The success of Recovery.gov is being noticed outside of the RATB as well: Andre Romano of Newsweek wrote, “The current incarnation of Recovery.gov…is perhaps the clearest, richest interactive database ever produced by the American bureaucracy.” The site has been given the 2009 Merit award, the 2010 Gold Addy award for Website design, InformationWeek Government IT Innovator 2010 Award, an Award of Distinction during the 16th Annual Communicator Awards, and a second place Gold Screen Award from the National Association of Government Communicators. Recovery.gov is also an official Honoree for the Financial Services category in the 14th Annual Webby Awards.

To learn more, see http://recovery.gov



More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
