By Sidney Smith
March 29, 2013 02:26 PM EDT
Most people go straight to connecting vCAC to vCenter, but I have decided to connect to Amazon EC2 first. I’m doing this for a few reasons, but mainly because anyone reading this has access to EC2. All you really need is a computer with a desktop virtualization tool like VMware Workstation and you can test vCAC with Amazon EC2. If you don’t have an Amazon AWS account, go to http://aws.amazon.com and sign up.
Signing up for Amazon AWS is free, and what’s even better is you can also provision “Micro Instances” for free for an entire year as long as you stay within the free-tier guidelines. The basics are:
- 750 hours of Linux/Windows Micro Instance usage per month (613MB memory). This is enough to run a single micro instance for the whole month.
- 750 Hours of Elastic Load Balancing plus 15GB of data processing
- 30GB of Elastic Block Storage
- 5GB of S3 Storage with 20,000 Get requests and 2,000 Put requests
- And some other goodies…..
You can run more than one micro instance at a time as long as the cumulative run time of your machines doesn’t go over 750 hours a month. Once you provision an instance it automatically counts as 15 minutes used. I don’t bother trying to calculate by the 15 minutes, so the way I look at it is I can perform 750 provisioning tests per month if each test is less than an hour.
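The budgeting math above can be sketched in a few lines. This is just an illustration of the article’s rule of thumb, assuming (as described) that every launch is rounded up to a 15-minute minimum against the 750-hour monthly allowance:

```python
# Rough free-tier budget math for micro-instance testing.
# Assumption (from the article): every launch counts at least
# 15 minutes (0.25 h) against the 750-hour monthly allowance.

FREE_TIER_HOURS = 750.0
MIN_BILLED_HOURS = 0.25  # 15-minute minimum per provisioned instance

def max_tests_per_month(hours_per_test: float) -> int:
    """How many provisioning tests fit in the monthly allowance."""
    billed = max(hours_per_test, MIN_BILLED_HOURS)
    return int(FREE_TIER_HOURS // billed)

print(max_tests_per_month(1.0))   # tests lasting up to an hour -> 750
print(max_tests_per_month(0.1))   # very short tests still bill 15 min -> 3000
```

So even very short provisioning tests draw down the allowance at the 15-minute minimum, which is why I simply plan for 750 sub-hour tests per month.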
Before we begin the configuration there are a few things we need in place. If you don’t already have vCAC installed and the foundation laid check out these posts to get going:
- vCloud Automation Center – What to know before you install!
- vCloud Automation Center – vCAC Manager Installation
- vCloud Automation Center – DEM Installation
- vCloud Automation Center – Laying the foundation
What we’re going to configure
In order to configure EC2 integration we are going to set up some additional components of vCAC as outlined below:
- Credentials – Credentials will be used by our endpoints to authenticate us to the infrastructure element managers that we are going to communicate with.
- Endpoint – Endpoints are how we manage connections from vCAC to other infrastructure elements in the environment. There are endpoints that allow us to communicate with EC2, vCenter, vCloud Director, vCenter Orchestrator, Hyper-V, and NetApp filers, as well as physical servers such as HP iLO, Dell iDRAC, and Cisco UCS.
- Enterprise Group – Although we already created an Enterprise Group, we are going to add Compute Resources to the group in this exercise. For more information on what Enterprise Groups are, see my earlier article “vCloud Automation Center – Laying the foundation”.
- Reservations – A resource reservation is how we provide available resources to our provisioning groups. Resource reservations map one-to-one to provisioning groups, and one is created for each type of resource you want to make available to your groups. We will discuss these in more detail in another article.
- Global Blueprints – A Blueprint is really a service definition that details what the consumer can request, along with all the policies and configuration of that service. We will create an Amazon EC2 Blueprint that a consumer can request through the service catalog in this example. I will cover Blueprints in greater detail in another article.
Configuring vCAC to provision to Amazon EC2
1.) The first thing we need to do is log into the vCAC console at “http://[host]/dcac“, then go to the “vCAC Administrator” menu on the “Left” and select “Credentials“.
2.) On the “Credentials” page select “New Credentials” in the “Upper Right” corner.
3.) Give your “Credential” a “Name” and “Description“. We then need to get your Amazon AWS “Access Key ID” and “Secret Access Key” which are covered in the following steps. The “Access Key ID” will be your “Username” and the “Secret Access Key” will be used as the “Password“.
Getting your AWS Access Key ID and Secret Access Key
4.) Log in to your Amazon AWS account at “http://aws.amazon.com“. In the top “Right” corner, hover over “My Account/Console” and then select “Security Credentials”.
5.) Scroll down the page until you get to the section labeled “Access Credentials” and you will see your “Access Key ID” displayed. Copy and paste this into the “Credentials” “Username” field.
6.) Next click “Show” to display your “Secret Access Key“. Copy and paste this into the “Credentials” “Password” field.
7.) Once you have input your “Username” and “Password” click the “Green” check on the “Left” hand side.
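It may seem odd to use an API key pair as a username and password, but the Secret Access Key works differently from a normal password: it is never sent to Amazon. Instead, each API request is signed with it using an HMAC, and Amazon recomputes the signature on its side. A minimal sketch of the idea, modeled on AWS Signature Version 2 (the string-to-sign below is simplified and the credential value is Amazon’s documented example key, not a real one):

```python
import base64
import hashlib
import hmac

def sign_request(secret_access_key: str, string_to_sign: str) -> str:
    """HMAC-SHA256 signature, base64-encoded, in the style of AWS
    request signing. The secret key itself never leaves the client."""
    digest = hmac.new(secret_access_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return base64.b64encode(digest).decode("ascii")

# Example secret key from AWS documentation, and a simplified
# string-to-sign for illustration only:
secret = "wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
to_sign = "GET\nec2.amazonaws.com\n/\nAction=DescribeRegions"
print(sign_request(secret, to_sign))
```

This is why vCAC (and any other EC2 client) needs the Secret Access Key itself rather than a session password: it has to produce a fresh signature for every API call it makes on your behalf.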
Creating an Endpoint
8.) Next go to the “vCAC Administrator” menu and click “Endpoints“. Once the “Endpoints” page displays, hover over “New Endpoint” and select “Amazon EC2“.
9.) Give your “Endpoint” a “Name” and then click the selection box next to “Credentials“. Select the “Amazon EC2” “Credentials” you just created and click “Ok“, then click “Ok” on the “New Endpoint” screen.
10.) You will now see your newly created Endpoint listed on the Endpoints screen. At this point vCAC executes a workflow that connects to Amazon AWS and validates your credentials. If your credentials are validated, the workflow proceeds to a data discovery, which detects the Amazon EC2 resources available for use. Once the discovery is finished, the Amazon EC2 resources become available for selection within the “Enterprise Group“.
Adding Compute Resources to an Enterprise Group
11.) Next let’s go to the “vCAC Administrators” menu and select “Enterprise Groups“. Once on the “Enterprise Groups” page, hover over the “Enterprise Group” we created and select “Edit“.
12.) In the “Enterprise Group” we now see the “Amazon Regions” that are available. Select the “Amazon Region” that you would like to use and “Click” “Ok“.
13.) Next if you go to the “Enterprise Administrators” Menu on the left and select “Compute Resources” you will see a “Compute Resource” for each “Amazon Region” you selected. Once the “Compute Resource” is available we can create a “Resource Reservation” to assign to our “Provisioning Group“.
Creating a Reservation
14.) On the “Enterprise Administrators” menu select “Reservations“, then hover over “New Reservation” in the upper right corner and select “Cloud“.
15.) On the “New Reservation – Cloud” page, select the “Drop Down” dialog next to “Compute Resource” and select the “Amazon EC2” “Compute Resource“.
16.) vCAC will auto-generate a “Name” for the “Reservation“; however, you can change the name if you like. Then select the “Drop Down” dialog next to “Provisioning Group” and select the “Provisioning Group” we created.
17.) Next, if you like, you can set a “Machine Quota” to limit the number of machines that can be provisioned onto this “Amazon AWS Reservation“. You must set a “Priority” for the “Reservation“, which is used to assist in making placement decisions if you have multiple reservations. I will talk more about this in another post. Once you have set your “Priority“, click the “Resources” tab above.
18.) “Amazon AWS” utilizes “Key Pairs” for enhanced security of machine management tasks. You have a few options within vCAC. You can let vCAC “Auto-generate a key pair per Provisioning Group“, “Auto-generate a key pair per Machine“, or you can use a “Specific key pair” that you have already created through the “Amazon AWS” console. I’m going to use the “Auto-generated per Provisioning Group” option in this example.
19.) Next we need to select the “Locations” within the selected AWS “Region” that we want to make available for use. I’m going to select them all. Then we need to select the “Security Group” we would like to make our machine part of. A “Security Group” can be thought of as a set of firewall rules for your machine. I’m going to select my “Default” “Security Group“. Optionally, you can select a “Load Balancer” to attach the machine to as well. I will cover this in a later article. When you are finished, click “Alerts” above.
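To make the firewall analogy concrete, here is a hypothetical security group modeled on the shape of the data the EC2 API reports for a group (the field names follow the EC2 API’s security group structure; the rule values are made up for illustration), along with a tiny helper that checks whether a port is open:

```python
# A security group is essentially a named set of firewall-style rules.
# Hypothetical example modeled on the EC2 API's security group
# structure; the rule values are made up for illustration.
default_group = {
    "GroupName": "default",
    "Description": "default security group",
    "IpPermissions": [  # inbound rules
        {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},   # SSH from anywhere
        {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},   # HTTP from anywhere
    ],
}

def allows(group: dict, protocol: str, port: int) -> bool:
    """Check whether any inbound rule in the group covers this port."""
    return any(rule["IpProtocol"] == protocol
               and rule["FromPort"] <= port <= rule["ToPort"]
               for rule in group["IpPermissions"])

print(allows(default_group, "tcp", 22))    # True
print(allows(default_group, "tcp", 3389))  # False: RDP is not opened
```

Any traffic not matched by an inbound rule is dropped, so a machine placed in this group would accept SSH and HTTP but nothing else.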
20.) Here you can optionally enable “Alerts” that will send notifications if the “Reservation” is nearing capacity. Set the “Quota Threshold” for your alert, the email addresses to be notified, and the “Reminder Frequency“, and click “Ok“.
21.) You will now see your newly created “Reservation” listed on the “Reservations” screen. Now select “Global Blueprints” located under the “Enterprise Administrators” menu.
Creating a Blueprint
22.) Once you are on the “Global Blueprints” page “Hover” over “New Blueprint” and select “Cloud”
23.) Once on the “Blueprint Information” tab give your “Blueprint” a “Name“, and optionally change the “Display Icon“. Next assign it to a “Group(s)” and then optionally override the “Prefix” associated with this “Blueprint“. Then you can optionally set the max number of machines a user can request for this blueprint and a daily cost if you wish. Once complete select the “Build Information” tab above.
24.) On the “Build Information” tab change the “Blueprint Type” to “Server”
25.) Then next to “Amazon Machine Image” click the “Selection” box.
26.) Once the dialog box appears you can filter the results at the top to narrow down the AMI you would like to use. If you selected multiple regions for use, make sure the AMI is in the region you want to use. Select the “AMI” you would like to use and click “Ok“.
27.) Optionally, you can override the “Key Pair” setting that we configured in the “Reservation“.
28.) Optionally, you can enable network options for the “Blueprint“. This will allow the requester to select the “Security Group” they would like to apply to the machine if more than one was selected in the “Reservation“.
29.) Next select the “Instance Types” you would like the requester to be able to choose from.
30.) Then select the “Security” tab above.
Making a Request
31.) “Hover” over the newly created “Blueprint” on the “Global Blueprints” page and select “Request machine” to test our configuration. You can also go to the “Self Service” menu and select “Request Machine”
32.) On the “Confirm Machine Request” page click the “Drop Down” next to “Instance Type” and select the type of “Instance” you would like to request.
33.) Then click the “Drop Down” next to “Provision Into” and select “Non-VPC Location” because we do not have a “VPC” configured.
34.) Next select the “Drop Down” next to “Location” and select a location to provision to.
35.) Next click the “Storage” tab above.
36.) Optionally you can add “EBS Storage” volumes to your “Request“. Click the “Network” tab above.
37.) Optionally, if you added more than one “Security Group” to your “Reservation” and enabled “Network Options” in the “Blueprint“, you can select a different “Security Group” for your machine. Click “Ok” when finished.
38.) Next under the “Self-Service” menu select “My Machines” to track the status of your request.
39.) Your newly requested machine will appear under “My Machines” and the status will show “Requested“. Note: If your machine does not show up, click refresh, as it can take a few seconds for it to appear.
40.) If you continue to refresh the page you will see the request’s updated “Status“. The next “Status” your “Request” will go to is “CloudProvisioning“.
41.) After your request goes to “CloudProvisioning“, if you log in to your “AWS Console” and go to the “AWS Management Console“, then “EC2“, and then “Instances“, you will see your newly provisioned machine in the “Pending” state.
42.) Once finished, the machine state in “vCAC” will go to “MachineProvisioned“, then “Turning On“, and finally “On“.
43.) You will now see your machine “Running” in the “AWS Console“.
44.) In “vCAC“, if you hover over your newly created machine you will see the “Machine Options Menu“. Select “Edit“.
45.) On the “Machine Information” tab, near the bottom, you will see “Admin Password“. Here you can show the local password for your newly provisioned “Amazon AWS Instance“. Click the “Storage” tab above. Note: It can take Amazon 30+ minutes to make the password available, even through the AWS Console. Once it is available from Amazon, it will not be available in vCAC until vCAC performs a data collection.
46.) On the “Storage” tab you can add “EBS” storage post-provisioning if you would like. Click on the “Network” tab above.
47.) On the “Network” tab you can assign an “Elastic IP Address” if you have made them available through “Amazon AWS“. You can also change the “Security Group” and assign the machine to a “Load Balancer“. Click “Ok” when you are done. More on these options soon.
There are a few important things to note. If you add additional services such as Elastic IP Addresses, Elastic Block Storage, Elastic Load Balancers, Security Groups, etc. through the Amazon AWS Console, they will not appear as available in vCAC until after the next Inventory Data Collection. You can perform a manual data collection, as well as change the data collection frequency, by doing the following:
- Go to “Enterprise Administrator” menu and select “Compute Resources“
- Hover over the “Compute Resource” and select “Data Collection“
- Under the “Inventory” section you can set the “Frequency” in hours as well as manually “Request” a “Data Collection“.
- If you “Request” a “Data Collection” you can select “Refresh” at the bottom of the page to get the status of the collection.