Monitoring and Analyzing AWS CloudTrail Data

Monitoring and Analyzing AWS CloudTrail data from multiple AWS regions

We recently released the AWS CloudTrail integration with Logentries - and, not surprisingly, we've seen a significant uptick in adoption, making it one of our most popular integrations. My job as Director of Customer Success is to make things as simple as possible for our customers. One question that consistently pops up is how to collect AWS CloudTrail logs from multiple AWS regions.

We follow Amazon's best practices when it comes to integrating with, and receiving information from, CloudTrail. In short, this works as follows:

  • When you configure CloudTrail, it writes events to an S3 bucket.
  • You can configure CloudTrail to send notifications to an Amazon SNS topic whenever new log events are recorded.
  • You can have these notifications delivered to an Amazon Simple Queue Service (Amazon SQS) queue, which enables you to handle them programmatically.
  • To configure Logentries to consume your CloudTrail logs, simply add the URL of the SQS queue on the Logentries/CloudTrail setup page.
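The consumer side of this pipeline can be sketched in Python with boto3. This is an illustrative sketch of what a consumer such as Logentries does with your queue, not actual Logentries code; the function names are hypothetical. The helper parses the SNS-wrapped notification that CloudTrail publishes, which carries an `s3Bucket` name and a list of `s3ObjectKey` entries.

```python
import json

def s3_objects_from_notification(body):
    """Extract (bucket, key) pairs from an SNS-wrapped CloudTrail
    notification as it arrives on the SQS queue."""
    message = json.loads(json.loads(body)["Message"])
    bucket = message["s3Bucket"]
    return [(bucket, key) for key in message["s3ObjectKey"]]

def poll_queue(queue_url):
    """Long-poll the SQS queue and fetch each referenced log file.
    Requires boto3 and AWS credentials; illustrative only."""
    import boto3  # deferred import so the parser above is usable without boto3
    sqs = boto3.client("sqs")
    s3 = boto3.client("s3")
    resp = sqs.receive_message(QueueUrl=queue_url,
                               MaxNumberOfMessages=10,
                               WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        for bucket, key in s3_objects_from_notification(msg["Body"]):
            obj = s3.get_object(Bucket=bucket, Key=key)
            yield obj["Body"].read()  # gzipped CloudTrail JSON
        sqs.delete_message(QueueUrl=queue_url,
                           ReceiptHandle=msg["ReceiptHandle"])
```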

Logentries speaks directly to the SQS queue inside of your AWS account, so an obvious question that presents itself is: If I'm running in multiple AWS regions, how do I get Logentries to pull from all of the regions?

The simple answer: you don't. Make AWS do the work for you!

Following the steps outlined below, you'll be able to monitor and analyze CloudTrail logs from any number of AWS regions all within one Logentries account.

Create an S3 Bucket
If you're new to the CloudTrail setup, the first requirement of CloudTrail logging is that the logs must go "somewhere." In AWS, that somewhere is an S3 bucket, which you need to create. Simply navigate to the S3 service and select ‘Create Bucket'. By default, the bucket is created with all of the permissions required - i.e., no extra permissions or configuration are necessary to use CloudTrail logging with Logentries.
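If you'd rather create the bucket with boto3 than through the console, a minimal sketch follows. The helper names are our own; the one real wrinkle it encodes is that S3's `create_bucket` must omit the `LocationConstraint` for us-east-1 but include it for every other region.

```python
def create_bucket_kwargs(name, region):
    """Build kwargs for S3 create_bucket. us-east-1 must NOT set a
    LocationConstraint; every other region must."""
    kwargs = {"Bucket": name}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

def create_trail_bucket(name, region="us-east-1"):
    """Create the bucket that all regions' trails will write to.
    Requires boto3 and AWS credentials."""
    import boto3
    s3 = boto3.client("s3", region_name=region)
    s3.create_bucket(**create_bucket_kwargs(name, region))
```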

Create an SQS Queue in a Primary Region
Next up, we need to create an SQS Queue to allow Logentries to consume your CloudTrail data. Create a new Queue and provide a ‘Queue Name' - default options are fine.

Add permissions to the SQS Queue
Once the queue has been created, the correct permissions must be applied. When adding permissions to the SQS queue, you need to add your full account number/name (officially called the AWS User ARN).

To get the User ARN, navigate to the IAM service, select the user that you want to use, and click ‘Summary'. When creating the user in the IAM section, make sure the user has at least ‘Read-Only' access, so that it has the permissions needed to read the bucket. The string you need is shown under User ARN in the ‘Summary' section and follows this format:

arn:aws:iam::<your-account-id>:user/<username>
Next, add the ‘Receive', ‘Send', and ‘Delete' actions to the SQS queue:
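The same permissions can be applied programmatically. The sketch below (boto3, with illustrative ARNs) builds a queue policy granting the IAM user the three actions named above, then attaches it with `set_queue_attributes`.

```python
import json

def sqs_user_policy(queue_arn, user_arn):
    """Queue policy granting the IAM user the Receive, Send, and
    Delete actions added via the console in the step above."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": user_arn},
            "Action": ["sqs:ReceiveMessage",
                       "sqs:SendMessage",
                       "sqs:DeleteMessage"],
            "Resource": queue_arn,
        }],
    }

def apply_queue_policy(queue_url, queue_arn, user_arn):
    """Attach the policy to the queue. Requires boto3 and credentials."""
    import boto3
    sqs = boto3.client("sqs")
    sqs.set_queue_attributes(
        QueueUrl=queue_url,
        Attributes={"Policy": json.dumps(sqs_user_policy(queue_arn, user_arn))})
```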
Enable CloudTrail in any region, and publish to an SNS topic

Once the above three steps have been completed, it's time to enable CloudTrail in the relevant regions. Navigate to the CloudTrail service in your AWS Console and turn on CloudTrail. Do not create a new S3 bucket; instead, select the S3 bucket created in step one above from the drop-down menu. Once you've done this, click the Advanced link. For the first region you enable CloudTrail in, remember to include Global Services under the Advanced options - this records API calls from global AWS services such as IAM and AWS STS. Make sure that "SNS notification for every log file delivery" is checked, and finally, specify an SNS topic to publish to. Give it a new SNS topic name - the topic will be created by CloudTrail.

Follow the steps above for each region that you want to collect CloudTrail logs from. NOTE: when adding subsequent regions, exclude Global Services to avoid duplicate log events being recorded for your global services.
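The per-region setup can also be scripted. This sketch (boto3; the trail and topic names are illustrative) creates one trail per region against the shared bucket, enabling global-service events only for the first region, exactly as the note above prescribes.

```python
def trail_configs(regions, bucket, topic):
    """One trail per region writing to the shared bucket. Only the
    first region includes global-service events (IAM, STS) so those
    calls are not logged twice."""
    return [{"Name": "logentries-trail",
             "S3BucketName": bucket,
             "SnsTopicName": topic,
             "IncludeGlobalServiceEvents": i == 0}
            for i, _ in enumerate(regions)]

def enable_trails(regions, bucket, topic):
    """Create and start a trail in each region. Requires boto3,
    credentials, and a pre-created bucket and topic name."""
    import boto3
    for region, cfg in zip(regions, trail_configs(regions, bucket, topic)):
        ct = boto3.client("cloudtrail", region_name=region)
        ct.create_trail(**cfg)
        ct.start_logging(Name=cfg["Name"])
```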

Subscribe the SQS Queue to the multiple SNS topics
Once each region has been set up, the last step in AWS is to subscribe your SQS queue to each newly created SNS topic. Navigate to the SQS service in your AWS Console and highlight the queue created in step two above. Under the ‘Queue Actions' menu at the top, select ‘Subscribe Queue to SNS Topic'. Use the ‘Topic Region' drop-down to select the region and the ‘Choose a Topic' drop-down to select the topic created in the previous step. Hit the ‘Subscribe' button and wait for confirmation that the queue has been subscribed to that topic.

After subscribing, make sure to copy the SQS URL from the ‘Details' section on the page.
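Subscribing the queue to each region's topic can likewise be sketched with boto3. SNS topic ARNs are region-scoped, so there is one subscription per region; the names below are illustrative. Note that the console's ‘Subscribe Queue to SNS Topic' action also adds a queue policy allowing the topic to send messages to the queue - if you subscribe via the API as below, you must grant that `sqs:SendMessage` permission yourself.

```python
def topic_arn(region, account_id, topic_name):
    """SNS topic ARNs are region-scoped:
    arn:aws:sns:<region>:<account>:<topic>."""
    return "arn:aws:sns:%s:%s:%s" % (region, account_id, topic_name)

def subscribe_queue_to_topics(queue_arn, regions, account_id, topic_name):
    """Subscribe the single SQS queue to the SNS topic in every
    region. Requires boto3 and AWS credentials."""
    import boto3
    for region in regions:
        sns = boto3.client("sns", region_name=region)
        sns.subscribe(TopicArn=topic_arn(region, account_id, topic_name),
                      Protocol="sqs",
                      Endpoint=queue_arn)
```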

Set Up Logentries to Pull Data from the SQS Queue
Log in to your Logentries account and navigate to your AWS settings area (My Account -> AWS). Select ‘Enable CloudTrail' and supply your IAM access key, secret key, and the SQS URL that you copied above. Hit Save! Note: your IAM access key and secret key are made available to you when you create a new IAM user, and should be stored safely.

Log data from CloudTrail will begin to stream in within approximately 15 minutes.

Sit back and let Logentries do its magic!
Visit our CloudTrail documentation to see some of the other cool things you can do - in particular, we provide out-of-the-box tags and alerts for important CloudTrail events. Have questions or ideas about how we can make our CloudTrail integration better? Reach out to me directly at [email protected].


More Stories By Trevor Parsons

Trevor Parsons is Chief Scientist and Co-founder of Logentries. Trevor has over 10 years experience in enterprise software and, in particular, has specialized in developing enterprise monitoring and performance tools for distributed systems. He is also a research fellow at the Performance Engineering Lab Research Group and was formerly a Scientist at the IBM Center for Advanced Studies. Trevor holds a PhD from University College Dublin, Ireland.
