Big Dollars from Big Data

How to reduce costs and increase performance in the data center

Cloud computing has given birth to a broad range of online services. To maintain a competitive edge, service providers are taking a closer look at their Big Data storage infrastructure, aiming to improve performance and reduce costs.

Large enterprises hosting their own cloud servers are seeking ways to scale and improve performance while holding or lowering expenditures. If providers keep scaling users and storage infrastructure the way they do today, it will become increasingly difficult to keep cloud services such as online account management or data storage inexpensive. Data centers will consume ever more energy, and many providers are loath to begin charging for online account access.

Costs vs. Benefits
In response to growing online account activity, many service providers are moving their data centers to a centralized environment in which data is stored in a single location and made accessible from anywhere via the Internet. Centralizing the equipment lets service providers keep costs down while delivering better Internet connections to their online users and realizing gains in performance and reliability.

Yet with these performance improvements, scalability becomes more arduous and cost-prohibitive. Improving functionality within a centralized data center requires buying additional high-performance, specialized equipment, driving up costs and energy consumption that become hard to control at scale. In an economy where large organizations are seeking cost-cutting measures from every angle, these added expenses are unacceptable.

More Servers, More Problems?
Once a telco moves into providing cloud-based services for its users, such as online account access and management, the demands on its data centers spike dramatically. Employees on a telco's or service provider's internal network expect high performance, but these systems serve relatively few users, who access files directly over the local network. Those employees also mostly open, send and save low-volume files such as documents and spreadsheets, which consumes little storage capacity and keeps the performance load light.

Outside the internal network, however, the service provider's cloud servers are accessed simultaneously over the Internet by far more users, and that concurrency itself becomes a performance bottleneck. Providers, telcos and other large enterprises offering cloud services therefore not only have to scale their storage systems for each additional user, but must also sustain performance across all users combined. Because so many more users are working with online account tools at any given time, cloud users place a far greater strain on data center resources.
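To see why per-user scaling compounds, consider a rough back-of-the-envelope calculation. The Python sketch below estimates the aggregate throughput a storage front end must sustain and how many nodes that implies; all figures (user count, per-user throughput, node capacity) are illustrative assumptions, not numbers from this article.

```python
# Rough sizing sketch with assumed, illustrative numbers.
concurrent_users = 50_000      # assumed simultaneous online-account users
per_user_mbps = 2              # assumed average demand per user (Mbit/s)
node_capacity_mbps = 10_000    # assumed usable throughput per node (Mbit/s)

aggregate_demand = concurrent_users * per_user_mbps        # total load, Mbit/s
nodes_needed = -(-aggregate_demand // node_capacity_mbps)  # ceiling division

print(f"Aggregate demand: {aggregate_demand:,} Mbit/s")
print(f"Nodes needed to carry it: {nodes_needed}")
# A single point of entry would have to absorb the full aggregate demand;
# spreading users across many nodes divides that load instead.
```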

Combining Best Practices
To remain competitive, cloud service providers must find a way to scale rapidly to accommodate surging demand for data storage. Service providers evaluating data storage options should look for the best combination of performance, scalability and cost-effectiveness. The following best practices can help maximize data center ROI in an era of IT cutbacks:

  1. Pick commodity components: Low-energy hardware can make good business sense. Commodity hardware not only costs less to buy, but also uses far less energy, cutting both setup and operating costs in one move.
  2. Look for distributed storage: Distributed storage is the best way to build at scale, even though the data center trend has been moving toward centralization. Software-level techniques can now raise performance enough to counterbalance the performance advantage of a centralized storage approach.
  3. Avoid bottlenecks at all costs: A single point of entry easily becomes a performance bottleneck. Adding caches to alleviate it, as most data center architectures presently do, quickly adds cost and complexity to a system. A horizontally scalable system that distributes data among all nodes, by contrast, avoids the bottleneck and delivers a high level of redundancy (see the sketch after this list).
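
One common way to realize item 3's "distributes data among all nodes" is consistent hashing, where any client can compute an object's home node locally, so no central lookup service or single entry point is needed. The sketch below is a minimal, generic illustration; the node names and virtual-node count are hypothetical, and it is not a description of any particular vendor's implementation.

```python
import hashlib
from bisect import bisect

class HashRing:
    """Minimal consistent-hash ring: maps object keys to storage nodes."""

    def __init__(self, nodes, vnodes=100):
        # Each physical node gets `vnodes` positions on the ring so data
        # spreads evenly and rebalancing on node changes stays small.
        self.ring = sorted(
            (self._hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )
        self.keys = [h for h, _ in self.ring]

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, object_key):
        # Walk clockwise to the first ring position at or after the key's hash.
        idx = bisect(self.keys, self._hash(object_key)) % len(self.keys)
        return self.ring[idx][1]

# Hypothetical nodes and object keys, for illustration only.
ring = HashRing(["node-a", "node-b", "node-c", "node-d"])
for obj in ["user42/photo.jpg", "user42/invoice.pdf", "user7/backup.tar"]:
    print(obj, "->", ring.node_for(obj))
```

Because placement is a pure function of the key, every node or client can route requests independently, which removes the single point of entry that item 3 warns about.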

Conclusion
Big Data storage today consists mainly of high-performance, vertically scaled storage systems. Because these architectures can only scale to about a single petabyte and are expensive, they are neither cost-effective nor sustainable in the long run. Moving to a horizontally scaled data storage model that distributes data evenly across low-energy hardware can reduce costs and increase performance in the cloud. With these insights, cloud service providers can take steps to improve the performance, scalability and efficiency of their data storage centers.

About the Author

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years he has designed and built enterprise-scale data storage solutions intended to store huge data sets cost-effectively. From 2004 to 2010 Stefan worked in this field at Storegate, a wide-reaching Internet-based storage service for consumer and business markets with the highest availability and scalability requirements. Before that, he worked on system and software architecture for several projects at Ericsson, the Swedish telecommunications giant and world-leading provider of equipment and services to mobile and fixed network operators.
