By Kevin Remde
January 13, 2013 10:30 AM EST
“We get it, Kevin.”
And you’ve seen excellent articles in this series already, describing how to use the Windows Azure portal to create your virtual machines, how to upload your own VM hard disks into the cloud and use them to build machines, and more. In today’s installment, I’m going to show you how easy it is to connect App Controller (a component of System Center 2012) to your Windows Azure account, and then how to use App Controller to create virtual machines in your Windows Azure cloud.
To do this, we need to have a few preliminaries in place:
- You have a Windows Azure subscription, and have requested the ability to preview the use of Windows Azure virtual machines. (If you don’t have an account, you can start a free 90-day trial HERE.)
- You have System Center 2012 App Controller installed. (Download the System Center 2012 Private Cloud evaluation software HERE.)
NOTE: You will need System Center 2012 SP1 App Controller, which at the time of this writing is available to TechNet and MSDN subscribers and volume license customers only; but will very soon be generally available. I will update this blog post as soon as that happens.
So, with nothing more assumed than just those basics, let’s walk through the following steps:
- Connect App Controller to your Windows Azure subscription (READ THIS POST for the instructions on how to do this.)
- Create a Storage Account in Windows Azure
- Use App Controller to create a new Virtual Machine
Assuming you’ve done part 1, and have your connection to your Windows Azure subscription set up in App Controller, let’s move on.
Create a Storage Account in Windows Azure
There are many ways to create a new storage account:
- I could use the Windows Azure administrative portal
- I could use PowerShell for Windows Azure and the New-AzureStorageAccount cmdlet
- Or I could do it using App Controller.
For our purposes, let’s use App Controller.
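(For reference, here is what the PowerShell alternative looks like, using the New-AzureStorageAccount cmdlet mentioned above. This is just a sketch: the account name and location below are placeholders, and it assumes you have installed the Windows Azure PowerShell module and imported your subscription settings with Import-AzurePublishSettingsFile.)

```powershell
# Sketch: create a storage account from PowerShell instead of App Controller.
# Assumes the Windows Azure PowerShell module is installed and your
# subscription has been imported (Import-AzurePublishSettingsFile).
# Storage account names must be lowercase letters and numbers, 3-24
# characters, and globally unique. "kevdemostore" is a placeholder.
New-AzureStorageAccount -StorageAccountName "kevdemostore" `
                        -Location "East US"

# Verify the new account:
Get-AzureStorageAccount -StorageAccountName "kevdemostore"
```

(You can use -AffinityGroup instead of -Location, mirroring the region/affinity-group choice in the App Controller dialog.)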
Open App Controller and log in with your administrative account. On the left, select Library.
Click Create Storage Account. Give your storage account a name, and choose a region or an affinity group.
Click OK. You should see something that looks like this at the bottom-right of the browser window:
After a few minutes, a refresh of the Library page should show you that you now have your new storage account available.
Now we need to create a container to hold our machine disk(s). With your new storage account selected, click Create Container.
Give your container a name and click OK.
In a very short while, you’ll see your new container.
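(The container step can also be scripted. A sketch, assuming the New-AzureStorageContainer cmdlet is present in your version of the Windows Azure PowerShell module, and using placeholder names throughout:)

```powershell
# Sketch: point the storage cmdlets at the account created above.
# "MySubscription" and "kevdemostore" are placeholders for your
# subscription name and storage account name.
Set-AzureSubscription -SubscriptionName "MySubscription" `
                      -CurrentStorageAccount "kevdemostore"

# Create a container named "vhds" to hold the machine disks.
New-AzureStorageContainer -Name "vhds"
```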
Now we’re ready to create virtual machines.
Use App Controller to create a new Virtual Machine
Open App Controller and log in with your administrative account.
On the left, select Virtual Machines. This is where we can see, manage, and create new virtual machine and service deployments. (If you’re doing this for the first time, you won’t see items in your list here just yet.)
Click Deploy. The New Deployment window opens up.
Under Cloud, click Configure…, then select your Windows Azure connection as the cloud into which you’re going to deploy your new virtual machine.
(Note: In my App Controller, I’ve also connected to a local VMM Server, which is why I see this other cloud in my list.)
Now you will see this:
Click Select an Item… under Deployment Type. Now you’ll see a screen that looks something like this:
This is where you can choose to build a new machine or service based on existing, provided images, or images or disks you’ve uploaded into your own Windows Azure storage. In this example, I’m going to select Images on the left, and choose to build a new Windows Server 2012 machine using the provided image.
Once I click OK, I now see this:
So the next thing I need to do is click Configure… under Cloud Service. Virtual machines and services all run in the context of cloud services. For our example, we’re going to assume that you haven’t created any machines or other items that require a service, so your list is going to be empty. You’ll use this screen to create and then select your new service.
Click Create… and then fill in cloud service details (Name, Description) and the cloud service location (a unique public URL, plus a geographic region or affinity group).
Next we need to configure the deployment:
Click Configure… under Deployment. Now you’ll see this:
Enter a deployment name, and optionally associate your machine with a virtual network if you have one. (If you don’t have one, or don’t select a network, the machine and service will be created to handle networking within the service automatically.) Click OK.
Now it’s time to configure the virtual machine itself.
Click Configure… under Virtual Machine.
Now we set the general properties…
Note: an Availability Set is not required, but a new one can be created or an existing one selected from here.
Set the Disks…
When I click Browse…, I’m given the ability to choose the location for my disks in Windows Azure storage, as well as to add (or create) additional data disks for this machine. For our example let’s use the storage account and container we created earlier. I won’t be adding any data disks.
For the Network…
…I’ll just leave the default. I could use this opportunity to define additional endpoints for connections to services on this machine, or I could do it later.
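(If you do decide to add an endpoint later, that can be done from PowerShell as well. A sketch, with placeholder service and machine names:)

```powershell
# Sketch: add an HTTP endpoint to an existing VM after deployment.
# "kevdemosvc" and "KevDemoVM" are placeholders for your cloud
# service name and virtual machine name.
Get-AzureVM -ServiceName "kevdemosvc" -Name "KevDemoVM" |
    Add-AzureEndpoint -Name "web" -Protocol tcp -LocalPort 80 -PublicPort 80 |
    Update-AzureVM
```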
For Administrator password…
…enter a password for the local administrator account. (It also looks like you can use this to assign the computer to a domain if you happen to have a domain controller in the same network or service. I haven’t yet tried this, so I can’t comment further.)
And now click Deploy.
You’ll see a notification towards the bottom right that should look something like this:
And after several minutes, looking in the Virtual Machines area of App Controller, you will see your new machine appear. Its status will change to “provisioning”, and eventually “running”.
Notice also that if you select your new machine, you also have the option now to connect to it via Remote Desktop! (Cool!) Log in as the Administrator with the administrator password you assigned, and you’re in!
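(For completeness, the whole deployment we just walked through in App Controller can also be sketched in PowerShell. The service name, VM name, password, and image name below are all placeholders, and this assumes your subscription and current storage account are already configured in the module:)

```powershell
# Sketch: find the provided Windows Server 2012 image. The actual
# ImageName value comes from the Get-AzureVMImage output.
Get-AzureVMImage | Where-Object { $_.Label -like "*Windows Server 2012*" } |
    Select-Object ImageName, Label

# Create the cloud service and virtual machine in one step.
# "kevdemosvc", "KevDemoVM", and the password are placeholders.
New-AzureQuickVM -Windows `
                 -ServiceName "kevdemosvc" `
                 -Name "KevDemoVM" `
                 -ImageName "<image name from the query above>" `
                 -Password "YourStrongP@ssw0rd" `
                 -Location "East US"

# Download and launch an .rdp file to connect, just as we did from
# the App Controller console:
Get-AzureRemoteDesktopFile -ServiceName "kevdemosvc" -Name "KevDemoVM" `
                           -LocalPath "C:\KevDemoVM.rdp" -Launch
```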
Naturally, you can very easily use App Controller to delete your machines, disks, storage containers, and storage accounts, too. (Remember to do that when you’re done. Even if a machine isn’t running, you’re still being billed for it and for the storage being used!)
Useful stuff? I hope so. Let me know in the comments if you have any questions or… comments.
And if you missed any of the other parts of our series, you can find the entire list HERE.