|By Lori MacVittie||
|June 30, 2014 09:00 AM EDT||
Inarguably one of the drivers of software-defined architectures (cloud, SDDC, and SDN) as well as movements like DevOps is the complexity inherent in today's data center networks. For years now we've added applications and services, and responded to new threats and requirements from the business with new boxes and new capabilities. All of them cobbled together using traditional networking principles that provide reliability and scale through redundancy.
The result is complex, hard to manage, and even more difficult to change at a moment's notice.
Emerging architectural models, whether based solely on cloud computing or part of larger software-defined initiatives, attempt to resolve this issue by introducing abstraction and programmability. Because deploying new services traditionally takes days, if not weeks or even months, we figure that moving to a programmatic, software-based model will make us more efficient.
Except we aren't becoming more efficient; we're just doing what we've always done, faster. We're not eliminating complexity; we're getting around it by adding a layer of scripts and integration designed to make us forget just how incredibly complex our networks really are.
One of the primary reasons our networks are the way they are is that we're reactive.
What we've been doing for years now is just reacting to events. Threats, new applications, new requirements - all these events inevitably wind up with IT deploying yet another "middle box." A self-contained appliance - hardware or software - that does X. Protects against X, improves Y, enhances Z. And then something else happens and we do it again. And again. And ... you get the point. We react and the result is an increasingly complex topological nightmare we call the data center network.
What we need to do is find a better model, a strategic model that enables us to deploy those solutions that protect against X, improve Y and enhance Z without adding complexity and further confusing the network topology. We need to break out of our tactical mode and start thinking strategically, transforming IT so that its results align with business expectations.
That means we need to start thinking platform, not product.
Platform is Strategic. Product is Tactical.
We know that the number of services actually in use in the data center has been increasing in response to technological shifts in security, cloud and mobility. We've talked to customers that rely on more than 20 different services (and vendors) critical to the security, performance and reliability of applications. Every time a new threat or a new trend impacts the data center, we respond with a new service.
That’s one of the reasons you rarely see a detailed architectural diagram at the application flow level – because every single interaction with a customer, partner or employee can have its own unique flow and that flow traverses a variety of services depending on the user, device, network and application and even business purpose.
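To see why flow-level diagrams get unwieldy so quickly, consider how many distinct service chains fall out of even a modest matrix of users, devices and networks. The sketch below is purely illustrative; the service names and selection rules are invented for this example and do not represent any particular vendor's products.

```python
# Hypothetical sketch: per-flow service chaining in a "product" model.
# Every combination of user, device, and network can yield a distinct
# chain of middleboxes -- which is why flow-level diagrams get unwieldy.

def service_chain(user_role, device, network):
    """Return the ordered list of services this flow traverses.
    All service names and rules here are illustrative, not real products."""
    chain = ["firewall"]                      # everything passes the firewall
    if network == "external":
        chain.append("ddos-scrubber")         # untrusted networks get scrubbed
    if device == "mobile":
        chain.append("mobile-optimizer")      # mobile flows get optimization
    if user_role == "partner":
        chain.append("identity-federation")   # partners authenticate via federation
    chain.append("load-balancer")             # all flows end at the LB tier
    return chain

# Three flows, three different topological paths through the network:
print(service_chain("employee", "laptop", "internal"))
print(service_chain("partner", "mobile", "external"))
print(service_chain("customer", "laptop", "external"))
```

With just four toy rules, three flows already take three different paths; multiply that by 20-plus real services and the topology becomes effectively undiagrammable.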
That's the product way.
What we need to do is shift our attention to platforms, and leverage them to reduce complexity while at the same time solving problems - and doing so faster and more efficiently. That's one of the primary benefits of Synthesis.
Synthesis' High Performance Services Fabric is built on a common platform - the ADC - glued together using new scalability models (ScaleN). The platform is what enables organizations to deploy a wide variety of services while gaining operational efficiencies from the fact that the underlying platform is the same. F5 Software Defined Application Services (SDAS) are all deployable on the same, operationally consistent platform regardless of where it might physically reside. Cloud, virtual machine or hardware makes no difference. It's the platform that brings consistency to the table and enables rapid provisioning of new services that protect X, improve Y and enhance Z.
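To make the "rapid provisioning" claim concrete: on a platform with a management API, deploying a new virtual service is an API call rather than a new box in the rack. The sketch below assumes an iControl-REST-style interface (the `/mgmt/tm/ltm/virtual` endpoint); the host, credentials, names and addresses are placeholders, and exact payload fields may vary by platform version, so verify against your own documentation.

```python
# Sketch: provisioning a new virtual service via a REST API instead of
# racking a new appliance. Assumes an iControl-REST-style endpoint; the
# host, credentials, and names below are placeholders, not real values.
import json

def build_virtual_server(name, destination, pool):
    """Build the JSON payload for a new virtual server.
    Field names follow iControl REST conventions; verify against your version."""
    return {
        "name": name,
        "destination": destination,   # e.g. "10.0.0.50:443"
        "pool": pool,                 # existing pool of application servers
        "ipProtocol": "tcp",
        "sourceAddressTranslation": {"type": "automap"},
    }

payload = build_virtual_server("app_https_vs", "10.0.0.50:443", "app_pool")
print(json.dumps(payload, indent=2))

# Deployment is then a single authenticated POST (shown here, not executed):
# requests.post("https://bigip.example.com/mgmt/tm/ltm/virtual",
#               auth=("admin", "secret"), json=payload)
```

The operational point is that the same call shape works whether the platform instance lives on hardware, in a virtual machine, or in a cloud - the consistency lives in the API, not the form factor.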
In the past year we've brought a number of new services to the Synthesis architecture including Cloud Identity Federation, Web Anti-Fraud, Mobile optimizations and a Secure Web Gateway. All these services were immediately deployable on the existing platform that comprises the Synthesis High Performance Services Fabric. As we add new capabilities and services, they, too, are deployable on the same platform, in the same fabric-based approach and immediately gain all the benefits that come from the platform: massive scalability, high performance, reliability and hardened security.
A platform approach means you can realize a level of peace of mind about the future and what might crop up next. Whether it's a new business requirement or a new threat, a platform approach means no more shoehorning a new box into the topology. It means being able to take advantage of operational consistency across cloud and on-premises deployments. It means being able to expand capabilities without needing to expand budgets to support new training, new services, and new contracts.
A platform approach to service deployment in data center networks is strategic. And with the constant rate of change headed our way thanks to the Internet of Things and mobility, the one thing we can't afford to go without is a sound strategy for dealing with the technological ramifications on the network.