
Fujitsu ICT Boosts Business Continuity for Kyoto University

On-campus cloud enables improved operational efficiency and convenience

Tokyo, Jan 10, 2013 - (JCN Newswire) - Fujitsu and Kyoto University today announced that they have worked together to virtualize 128 servers and build an on-campus private cloud system aimed at strengthening business continuity plan (BCP) capabilities for the university's all-purpose server system and at optimizing its ICT investments. The new system began operations on December 28, 2012. In addition, the university plans to proceed with the construction and operation of a BCP site for the new system using Fujitsu's datacenter in eastern Japan.

As a result of this project, it will be possible to maintain availability of key services on the university's homepage and other servers even during natural disasters or blackouts in the Kansai region. Operated and administered by Kyoto University's Institute for Information Management and Communication and the Academic Center for Computing and Media Studies, the new cloud environment allows over 400 virtual servers to be operated, thereby enabling the consolidation of general-purpose servers that have until now been independently operated throughout the campus. This will help to streamline server operations and management while delivering increased convenience to faculty and researchers.

In leveraging the new system environment, Kyoto University plans to further optimize its ICT investments and cultivate an even more cutting-edge and supportive environment for nurturing human resources and promoting research and development.

Background to the Deployment

After undergoing partial privatization in 2004, national universities in Japan have had to adjust to a host of new organizational and environmental changes, including a need to communicate research results and educational curricula to audiences both within and outside the universities. Moreover, in both academic and research settings, it has become essential to be able to transmit a wide range of large-capacity content across networks.

With these trends gaining traction, Kyoto University established the Institute for Information Management and Communication in 2005 to plan, develop, manage, and operate the university-wide IT infrastructure; provide a variety of user services related to that infrastructure; deliver an advanced and secure information environment; and develop personnel capable of advanced use of information and information technologies. The Institute includes the Academic Center for Computing and Media Studies, which is engaged in research, development, and educational support, and which has explored ways to ensure stable information communications and seamless content transmission.

During this process, the university looked to augment the number of servers and the processing and storage capacity of the all-purpose server system employed for websites and repositories of academic information, as well as for the sharing of content between laboratories. The aim was to build an ICT environment equipped to handle natural disasters and power shortages. In developing this environment, the university also faced the important challenges of optimizing its ICT investments and streamlining administrative workloads.

Features and Results of the New System

1. Server virtualization and consolidation

The private cloud environment consists of a total of 128 Fujitsu PRIMERGY CX400 S1 and PRIMERGY CX250 S1 multi-node servers, which feature exceptional energy efficiency and a compact footprint, making it possible to run over 400 virtual servers. Using the new private cloud environment, Kyoto University plans to virtually consolidate servers across the campus as part of an effort to further increase the efficiency of campus-wide server management and reduce power consumption.

2. Ensures operational continuity of critical servers

By sharing duplicate versions of essential data from some of the virtual servers between Kyoto University and a BCP site, it is possible to switch operations to a backup server hosted at Fujitsu's datacenter during emergencies. This, in turn, will enable continued operations of critical servers that provide various services on the university's homepage, even during natural disasters or blackouts on campus.

3. Campus-wide faculty e-mail system now available 24/7

Previously, availability of the faculty e-mail system was limited by the working hours of IT administrators and by statutory inspections. By rebuilding the e-mail system on Fujitsu's datacenter and outsourcing its operations, the university has made it available to faculty 24/7.

4. Automatically backs up critical data

Fujitsu ETERNUS NR1000F series storage systems have been installed in both the on-campus private cloud environment and the BCP site. Using the storage system's SnapMirror functionality, new data is periodically transferred between the on-campus system and the BCP site, allowing critical data to be automatically and remotely backed up.

5. Employs Fujitsu datacenter

The system uses a Fujitsu datacenter that is located in an area of eastern Japan where the risk of damage from natural disasters is low. The datacenter is equipped with the latest disaster mitigation measures to deliver safety, reliability, and efficiency, as well as 24/7 system operations. This has resulted in superior reliability, security, and availability for the campus-wide faculty email system and BCP site, while also optimizing ICT resources.

6. Uses the SINET4 academic information network

The system uses the SINET4 Science Information Network* for communications between Kyoto University and the Fujitsu datacenter, resulting in a network that is more reliable and easier to use, while making effective use of ICT resources.

* SINET4 Science Information Network:
An information and communication network connecting universities and research institutions throughout Japan constructed and operated by the National Institute of Informatics.

About Fujitsu Limited

Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012. For more information, please see www.fujitsu.com.



Source: Fujitsu Limited

Contact:
Fujitsu Limited
Public and Investor Relations
www.fujitsu.com/global/news/contacts/
+81-3-3215-5259


Copyright 2013 JCN Newswire. All rights reserved. www.japancorp.net
