
A Hybrid Computing Model for SharePoint

Seven critical success factors

For many companies, the business benefits that cloud computing promises are too compelling to ignore: improved agility, lower costs, better resource allocation, and fewer operational issues. As a result, organizations have been moving commodity infrastructure and services to cloud-based services managed by some of the world's leading technology companies - including Office 365, Microsoft's primary offering for business productivity in the cloud.

Several new developments are making Office 365 even more enticing:

  • New release of SharePoint: Extranets and public-facing websites are expensive in SharePoint 2010. However, with pricing changes and new web content management (WCM) functionality for SharePoint 2013, many organizations are beginning to take a second look at the cloud for some work streams as Office 365 gets updated in early 2013 with the latest SharePoint version.
  • Broad Office 365 adoption: According to Kurt DelBene, President of the Office Division at Microsoft, Office 365 is on target to become the fastest-selling server product in Microsoft's history, outpacing all analyst expectations.
  • Additional cost savings: With Office 365, organizations pay a monthly fee per user and gain access to ongoing maintenance and expertise to manage servers. That saves them from a huge, upfront capital expense. (We should point out, however, that because SharePoint Online is more of a product and service than a platform, it has more limited capabilities than the on-premises version, so long-term cost implications are not yet known; a rough illustration of this trade-off follows this list.)
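
To make the long-term cost question concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it - the per-user subscription fee, the upfront on-premise cost, and the ongoing operations cost - is a hypothetical placeholder rather than published pricing; the point is only to show how the two cost curves can be compared over time once you plug in your own numbers.

```python
# Back-of-the-envelope comparison of cumulative spend: Office 365 subscription
# vs. an on-premise SharePoint farm. All figures are hypothetical placeholders,
# not published Microsoft pricing.

USERS = 500
MONTHLY_FEE_PER_USER = 20.0    # assumed subscription cost per user per month
ONPREM_UPFRONT = 150_000.0     # assumed hardware, licensing, and deployment cost
ONPREM_MONTHLY_OPS = 4_000.0   # assumed ongoing admin, patching, and hosting cost

def cumulative_cost_cloud(months: int) -> float:
    """Cumulative subscription spend after a given number of months."""
    return USERS * MONTHLY_FEE_PER_USER * months

def cumulative_cost_onprem(months: int) -> float:
    """Upfront capital expense plus ongoing operational spend."""
    return ONPREM_UPFRONT + ONPREM_MONTHLY_OPS * months

if __name__ == "__main__":
    for months in (12, 24, 36, 60):
        cloud = cumulative_cost_cloud(months)
        onprem = cumulative_cost_onprem(months)
        print(f"{months:>3} months: cloud ${cloud:>10,.0f}  on-premise ${onprem:>10,.0f}")
```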

Given these gains, no company should ignore a move to the cloud. However, a full jump to the public cloud without careful consideration is ill advised. Some companies can't move everything to the cloud because they have compliance, regulatory, or government restrictions that limit where data can be stored and who can have access to it. But many companies shouldn't move everything to the cloud, because there is simply not parity between online and on-premise versions of SharePoint. What makes SharePoint compelling for many enterprises is the ability to extend, customize, and integrate with other enterprise systems, much of which is impossible with the Office 365 platform. Until there is parity, certain work streams should stay in their current environments.

What's needed now is a thoughtful, strategic approach to cloud computing that is based on the needs of your business. Understanding which aspects of your organization's business systems can be moved, and should be moved, to the cloud is an important discussion that leaders must undertake. Although some work streams can be moved easily, many others require customization and management that only an on-premise deployment can support. That's why a hybrid model - comprising on-premise, private, and/or public cloud components - offers organizations the best way to ease into the cloud computing paradigm and leverage the promise of SharePoint Online.

Where to start? As part of readiness planning for cloud services adoption, companies must address seven critical success factors:

1. What are the business requirements?
As a first step, organizations must get their arms around the underlying business requirements of the environment, the key use cases, and the key work streams. For example, it may be possible to build out a lightweight customer relationship management solution, acting as a portal for sales, marketing, and support to connect with customers. But it may not make sense to move development team activities to the cloud due to integration issues with case management or configuration management systems. Outline your key work streams, and then begin to map each work stream to your on-premise and online models to see which activities can be improved upon.

2. What are the business implications?
Companies must understand the implications of moving each use case and each work stream into the cloud. You must know the answers to these key questions: Can current functionality, security, audits, and reporting be replicated in the cloud? If key functions cannot be offered and supported, what are the risks? Say you have a ticketing system, with SharePoint acting as the front end. Without a full understanding of the solution's architecture and how data is shared between the ticketing platform and SharePoint, it may be difficult to understand the true cost implications of moving to the cloud. You also need to understand the performance and cost impact of the large number of web service calls the platform may make in a pure cloud environment. Depending on the volume of data moved, how it is moved, and the timeframe for moving this important business system to the cloud, it may make financial sense to maintain an on-premise version of SharePoint for your product and support organizations.
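
One way to ground that analysis is to time representative web service calls against both environments before committing. The sketch below is a minimal latency probe in Python using only the standard library; the endpoint URLs are hypothetical placeholders and authentication (which SharePoint requires in practice) is omitted, so treat it as a measurement sketch rather than a working client.

```python
# Rough latency probe for comparing web service call overhead between an
# on-premise SharePoint farm and SharePoint Online. The endpoint URLs below are
# placeholders, and authentication (required in practice) is omitted entirely.

import time
import urllib.request

ENDPOINTS = {
    "on-premise": "http://sharepoint.example.local/sites/support/_api/web",  # hypothetical
    "online": "https://contoso.sharepoint.com/sites/support/_api/web",       # hypothetical
}
CALLS_PER_ENDPOINT = 20

def average_latency(url: str, calls: int) -> float:
    """Mean round-trip time in seconds for repeated GET requests (failed calls are skipped)."""
    timings = []
    for _ in range(calls):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                response.read()
        except OSError:
            continue  # unreachable or unauthorized endpoint; skipped in this sketch
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings) if timings else float("nan")

if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        print(f"{name}: ~{average_latency(url, CALLS_PER_ENDPOINT):.3f}s per call")
```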

3. What are the management ramifications?
Companies must understand the management implications of each work stream. It's not just a matter of "can we move it?" but "should we move it?" In some instances, a move to the cloud may result in added administrative effort and costs. Case in point: In one of the most common hybrid scenarios, a business that uses Office 365 as its extranet while maintaining an on-premise or dedicated cloud SharePoint environment may find that managing permissions, storage, and usage/activity is an extremely manual and time-consuming process. That's because there are no native tools for managing these functions across disparate SharePoint environments. Therefore, it's critical to look at your current metrics and KPIs for managing SharePoint, and understand the implications of duplicating these metrics within a new cloud environment - as well as combining and normalizing this data across all systems.
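
Because no native cross-environment tooling exists, teams often end up scripting this consolidation themselves. The sketch below is a minimal Python example of merging per-site usage exports from the two environments into one normalized view; the CSV file names and column headings are assumptions for illustration, since each environment exposes this data differently.

```python
# Minimal sketch of normalizing usage/activity exports pulled separately from an
# on-premise farm and from Office 365 into one combined report. The CSV column
# names ("SiteUrl", "StorageMB", "ActiveUsers") and file names are assumptions.

import csv
from collections import defaultdict

def load_usage(path: str, environment: str) -> list[dict]:
    """Read one environment's usage export and tag each row with its source."""
    with open(path, newline="") as handle:
        return [
            {
                "environment": environment,
                "site": row["SiteUrl"],
                "storage_mb": float(row["StorageMB"]),
                "active_users": int(row["ActiveUsers"]),
            }
            for row in csv.DictReader(handle)
        ]

def summarize(rows: list[dict]) -> dict[str, dict]:
    """Roll the combined rows up per environment."""
    totals: dict[str, dict] = defaultdict(lambda: {"storage_mb": 0.0, "active_users": 0})
    for row in rows:
        totals[row["environment"]]["storage_mb"] += row["storage_mb"]
        totals[row["environment"]]["active_users"] += row["active_users"]
    return dict(totals)

if __name__ == "__main__":
    combined = (load_usage("onprem_usage.csv", "on-premise")
                + load_usage("office365_usage.csv", "online"))
    for env, totals in summarize(combined).items():
        print(env, totals)
```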

4. What about the end user experience?
We can't state this strongly enough: Companies must understand what the end-user experience will look like. If a hybrid environment adds effort or decreases productivity for end users, what is the cost? Consider these factors: Access to your enterprise platforms probably begins with a single sign-on experience - you log in once to get to all the tools and systems you need to do your business tasks. Your organization may have made significant investments to brand your internal platforms and put processes in place to maintain consistency across team sites and business unit portals. But, if you add another external system to the mix, what happens to the end-user experience? If permissions are separate, how does that affect end-user productivity? Your imperative is to understand how your primary users will conduct their business, thinking about the end-to-end experience, not just whether key functionality is being met through the new system. Remember: the more difficult a system is to use, the less likely people will be to use it.

5. How will we move?
If parts of your organization and key work streams are moved to the cloud, what is the plan for the move? Will you move all at once? In waves? What about training? Migration and onboarding strategies vary widely. Your strategy should be based on critical-path business use cases, helping those who rely most on the new system before the masses. One strategy is to follow the 80/20 rule: concentrate your planning around the 20% of users who use SharePoint most heavily, giving them 80% of your time, while spending 20% of your time with the remaining (mostly casual) users. However you decide to time moving end users and work streams, you must involve end users as you formulate and communicate your plan. The more you involve people in the process, the more likely they are to support that process.

6. How will we measure success?
Companies must have the right metrics in place to track performance across the entire environment. Companies also need to think about whether content needs to be synchronized between environments or whether these use cases can be maintained separately. Most companies fail at this today because they don't have sufficient visibility into their SharePoint environment to generate and track adequate metrics. Moving to a hybrid model is a great opportunity to correct this. One strategy is to begin by identifying three key metrics across both systems, and build from there. An example might be Top 10 Most Active Sites, Top 10 Most Active Users, and Most Active Content Databases. Based on this data, you will gain a much better understanding of who is actually using SharePoint and where, allowing you to better allocate your time and resources to support the sites and users who are most active on the platform.
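
As a starting point, a small script can compute those three example metrics from a combined activity export covering both environments. The Python sketch below assumes a simple list of activity events with hypothetical "site", "user", and "content_db" fields; substitute whatever fields your actual audit or usage exports provide.

```python
# Sketch of the three starter metrics described above, computed from a combined
# activity log spanning the on-premise and online environments. The event fields
# ("site", "user", "content_db") are hypothetical placeholders.

from collections import Counter

def top_n(events: list[dict], field: str, n: int = 10) -> list[tuple[str, int]]:
    """Return the n most frequent values of a field across all activity events."""
    return Counter(event[field] for event in events).most_common(n)

if __name__ == "__main__":
    # A handful of fabricated events standing in for a real export.
    events = [
        {"site": "/sites/sales", "user": "alice", "content_db": "WSS_Content_01"},
        {"site": "/sites/sales", "user": "bob", "content_db": "WSS_Content_01"},
        {"site": "/sites/support", "user": "alice", "content_db": "WSS_Content_02"},
    ]
    print("Top 10 Most Active Sites:", top_n(events, "site"))
    print("Top 10 Most Active Users:", top_n(events, "user"))
    print("Most Active Content Databases:", top_n(events, "content_db"))
```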

7. How will we enforce governance?
Ask yourself: Do we have a defined change management process? Do we have our roles and accountabilities defined? Are we actively reviewing and taking action on new requests? Are we giving end users and admins visibility into the changes being made and the priorities of those requests? Having a governance body in place is crucial. Without automation, manual governance practices (policies, documentation, metrics) need to be extended or duplicated, with appropriate roles defined and assigned. Best practices include running through the document lifecycle across environments and identifying where current policies break. Focus your attention on the governance policies that manage risk - compliance, regulations, retention, or any other legal requirements of a hybrid system. Then define what it will take to maintain minimum security levels, and create a plan for automating and simplifying.

Despite the risks, some companies may be drawn into the cloud by the perceived cost savings even when they have significant customization or integration needs. This is a recipe for failure. Companies that successfully make the move to a hybrid model are those that understand which business activities can be offloaded to the cloud to benefit from its scale and cost advantages.

The beauty of a strategic, hybrid model is that it's not "all or nothing." By addressing the seven critical success factors outlined above, companies will be taking a holistic approach versus making a blind technology decision. By focusing on specific work streams, and only building out those work streams that can be supported, your company will end up with a strategic, hybrid model that supports the needs of your business.

More Stories By Christian Buckley

Christian Buckley serves as Director of Product Evangelism at Axceler. He is a Microsoft SharePoint MVP and an internationally recognized expert on collaboration platforms, social informatics, business analysis, and project management topics. As Axceler's evangelist, he drives product awareness, partner advocacy, and community development. He previously worked at Microsoft as part of the enterprise hosted SharePoint platform team (now part of Office 365).

Prior to Microsoft, Christian was managing director of a regional consulting firm in the San Francisco East Bay, co-founded and sold two technology startups, and worked with some of the world’s largest hi-tech and manufacturing companies to build and deploy collaboration and supply chain solutions.
