
Fujitsu


Archived content

NOTE: this is an archived page and the content is likely to be out of date.

Abstracts of Magazine FUJITSU 2015-7 (Vol. 66, No. 4)

Special Issue: Leveraging Big Data to Achieve Innovation

Overview

  • Big Data: Trends and Fujitsu's Approaches

It has been a few years since the term big data became widely recognized, and big data is now entering a full-blown diffusion phase. The Internet of Things (IoT)―the next major trend in information and communications technology (ICT)―is expected to boost the amount of data generated daily, and progress is being made in hardware and software that can process big data at high speed and low cost. Technological advances in areas such as machine learning and artificial intelligence are enhancing the analytics used to leverage big data, which are now deployed to support decision-making in business management and operations. The field of big data application is thus expanding from business innovation to social innovation. This paper gives a general outline of this special issue on Leveraging Big Data to Achieve Innovation and summarizes it from the viewpoints of "data," "analytics" and "applications." By arranging the articles in this issue in terms of these three themes, it also describes Fujitsu's latest projects in the field of big data.

State-of-the-art Technology to Support Big Data Utilization

  • Linked Open Data for Enhanced Use of Open Data and New Applications

In Japan, the Declaration to be the World's Most Advanced IT Nation (approved by the Cabinet in June 2014) designates the two-year period from 2014 to 2015 as an intensive development phase for promoting the use of public data (open data) in the private sector. In line with this, an increasing number of bodies, including governmental offices and local authorities, are rolling out initiatives to leverage open data. Fujitsu Laboratories has long been developing technology to enhance the value of open data. It focuses on linked open data (LOD), recognized as 5-star open data, and part of its research results are publicly accessible. This paper introduces some of Fujitsu Laboratories' projects for creating an open data ecosystem, including a Web-based LOD query service (LOD4ALL) and a Web tool to visualize region-specific information (EvaCva).
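Linked open data is typically queried through SPARQL over HTTP. The abstract does not describe LOD4ALL's actual interface, so the sketch below is a generic, illustrative example of how a SPARQL request to any LOD endpoint is assembled; the endpoint URL and the DBpedia vocabulary are assumptions, not LOD4ALL's API.

```python
from urllib.parse import urlencode

# Placeholder endpoint; a real LOD service would publish its own URL.
ENDPOINT = "https://example.org/sparql"

# A typical SPARQL query over 5-star linked open data: the five largest
# cities in Japan by population, using DBpedia's vocabulary as an example.
query = """
PREFIX dbo: <http://dbpedia.org/ontology/>
SELECT ?city ?population WHERE {
  ?city a dbo:City ;
        dbo:country <http://dbpedia.org/resource/Japan> ;
        dbo:populationTotal ?population .
}
ORDER BY DESC(?population)
LIMIT 5
"""

# SPARQL endpoints commonly accept the query as a URL-encoded 'query'
# parameter in an HTTP GET request (SPARQL 1.1 Protocol).
request_url = ENDPOINT + "?" + urlencode({"query": query, "format": "json"})
print(request_url[:80])
```

Sending `request_url` with any HTTP client would return the result bindings as JSON; the point is that one standard query language works across every dataset published as LOD, which is what makes the "ecosystem" framing possible.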

  • Challenge of Passing Mathematics Entrance Exam with Computer Algebra

Today's increasing interest in big data is backed by an aspiration for better ways of analyzing data to create new value. This creation of value from knowledge gained through efficient data analysis is a source of renewed interest in component technologies of artificial intelligence (AI), such as natural language processing, image recognition and machine learning. While the range of intelligent tasks driving AI forward continues to grow, big data alone is insufficient for some of them. To process data correctly even when only a small volume of data is available, new logic needs to be integrated into existing AI technology. One attempt to achieve this is a project to develop AI capable of automatically solving university entrance exam problems posed in natural language, with the goal of exceeding the threshold required for admission to the University of Tokyo (Todai); hence its name, "Todai Robot Project - Can a Robot Pass the University of Tokyo Entrance Exam?" Having long researched computer algebra and quantifier elimination (QE), powerful tools applicable to entrance-exam mathematics problems, Fujitsu Laboratories is taking part in the project's math team. This paper outlines an automatic solution scheme for university entrance exams in mathematics and explains its technical challenges. It also describes evaluations of the current system on problems from real and mock exams.

  • Large-scale Security Log Analysis based on Outlier Extraction Technology

In recent years, cyberattacks have not only increased in number but also become more aggressive and sophisticated. To counter such attacks, security analysts must detect and analyze unrecognized tactical attack events and their intentions before the main attacks begin. Conventional methods such as rule-based attack detection and simple outlier detection are not sufficient: tactical attack events are hidden among numerous known attack events, and it is difficult to extract and analyze them with conventional methods alone. To solve this problem, Fujitsu Laboratories has developed Outlier Extraction Technology, which singles out outlier structures (clusters of data records sharing rare values) from a large-scale discrete event log such as a security event log. By applying this technology to a network monitoring log, security analysts can detect and analyze a sequence of tactical attacks. In this paper, we present a case study applying the technology to an intrusion detection system (IDS) log, in which we successfully extracted an unrecognized attack sequence with advanced strategies. Detailed analysis of one of the outlier structures by a security analyst led to this finding. Furthermore, we describe a Computer Emergency Response Team (CERT) project that applies the outlier extraction technology to a real IDS log.
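The abstract does not detail Fujitsu's algorithm, but the core idea of an outlier structure ― a cluster of records sharing rare values ― can be illustrated with a minimal sketch: count attribute-value frequencies over a discrete event log, flag values below a rarity threshold, and group the records that share them. The log records and thresholds below are toy assumptions.

```python
from collections import Counter, defaultdict

# Toy security-event log: each record is a dict of discrete attributes.
log = [
    {"src": "10.0.0.1", "sig": "scan",    "dst_port": "80"},
    {"src": "10.0.0.2", "sig": "scan",    "dst_port": "80"},
    {"src": "10.0.0.3", "sig": "scan",    "dst_port": "80"},
    {"src": "10.0.0.9", "sig": "exploit", "dst_port": "4444"},
    {"src": "10.0.0.9", "sig": "exfil",   "dst_port": "4444"},
]

# 1. Count how often each (attribute, value) pair occurs.
counts = Counter((k, v) for rec in log for k, v in rec.items())

# 2. Mark values whose frequency falls below a rarity threshold.
threshold = 2
rare = {kv for kv, n in counts.items() if n <= threshold}

# 3. Group records that share a rare value: each such cluster is an
#    "outlier structure" in the sense described above.
clusters = defaultdict(list)
for i, rec in enumerate(log):
    for kv in rec.items():
        if kv in rare:
            clusters[kv].append(i)

# Records 3 and 4 share the rare destination port 4444, hinting at a
# connected sequence of unusual events worth an analyst's attention.
print(clusters[("dst_port", "4444")])
```

A lone rare value flags a single event; what makes the structure interesting is that several records share it, letting an analyst see a *sequence* of related tactical events rather than isolated alerts.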

  • Technology for Speeding up Parallel Distributed Processing and Similarity Search to Support Utilization of Big Data

Several years have passed since the phrase "big data" began to spread. Utilization of big data first appeared in consumer services but has recently been spreading among enterprises. In an increasing number of cases, data that conventional information and communications technology (ICT) systems could not handle is processed to develop strong-selling products or to prevent accidents and hazards. Unlike in consumer services, utilizing big data in enterprises involves combining business data, as the basic information, with on-site data from sensors or images, which have now become available in the field, or with external data such as open data from social networking services or local governments, in order to obtain new knowledge. The basic requirements in that process are high-speed processing of combinations of multiple data series and retrieval of the necessary information from sensor and image data. This paper presents Hadoop business data utilization technology, which accelerates batch processing of large volumes of enterprise business data by means of distributed processing, and high-speed approximate similarity search technology, which makes similarity search over sensor and image data significantly faster than the conventional method.
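The abstract does not specify Fujitsu's approximate similarity search method, but a standard way such speed-ups are achieved is locality-sensitive hashing: vectors are hashed so that similar ones tend to share a bucket, and only bucket-mates are compared exactly. The random-hyperplane sketch below is an illustrative, generic LSH example, not the paper's algorithm.

```python
import math
import random

random.seed(42)
DIM, BITS = 8, 6

# Random hyperplanes for sign-based hashing (random-projection LSH).
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def signature(v):
    """Hash a vector to a bit tuple: one bit per hyperplane side."""
    return tuple(1 if sum(p * x for p, x in zip(plane, v)) >= 0 else 0
                 for plane in planes)

# Index a small database of random feature vectors into hash buckets.
db = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(50)]
buckets = {}
for i, v in enumerate(db):
    buckets.setdefault(signature(v), []).append(i)

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

# Query: only vectors in the same bucket are compared exactly, so most
# of the database is never touched.
query = db[0]
candidates = buckets[signature(query)]
best = max(candidates, key=lambda i: cosine(query, db[i]))
print(best, "found after comparing", len(candidates), "of", len(db), "vectors")
```

The search is "approximate" because a true nearest neighbour can land in a different bucket; in practice several hash tables are used to trade a small loss of recall for a large reduction in comparisons.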

State-of-the-art ICT Infrastructure to Support Big Data Utilization

  • Vertically Integrated Data Warehouse Platform for Age of IoT

Advancements in sensors and mobile devices are enabling us to obtain large amounts of data in real time. This is driving a shift in data analysis from dealing with accumulated past data to handling real-time data. When utilizing real-time data, it is essential to perform speedy analysis to facilitate quick action, which is achieved by converting voluminous data into forms that humans can comprehend more easily, such as maps and drawings. This change in analysis methods has brought about technological innovation in three areas―analytics, queries and the data itself―represented by the diversification of analytical algorithms, a growing variety of intuitive visualization tools, and cutting-edge image recognition technology such as deep learning. Against this background, databases, as an integral part of data analysis, need to facilitate fast data extraction and analysis, incorporating various new technologies while offering compact, high-capacity storage for diverse data. The vertically integrated data warehouse platform FUJITSU Integrated System PRIMEFLEX for Analytics meets all the above requirements. This paper explains the platform and how it realizes fast data extraction and analysis, high-speed mapping of sensor data, and the technology to generate information from unstructured data.

  • Big Data Analysis Platform Incorporating Cold Storage

Big data analysis involves processing a vast amount of data and requires data storage with gigantic capacity. Conventional means of data storage are becoming infeasible in terms of both capacity and cost. As an alternative, "cold storage" is receiving increased attention: large-capacity, low-cost storage that can serve as a final big data archive. Cold storage facilitates long-term retention of large volumes of historical data that would previously have been discarded, which in turn enables reanalysis, or redefinition of analytical rules, taking such data into account. Fujitsu is developing a historical data search engine based on a big data analysis platform that incorporates cold storage. In this paper, we introduce commercial optical disks as an example of cold storage and describe a case of their application in a big data analysis platform.

  • IoT Platform for Utilizing New Big Data

The Internet of Things (IoT), an environment connecting all things, people and events, provides a variety of experiences and offers decision-making support and new discoveries by allowing people to use gathered data as valuable information. In addition to existing social big data, enterprises are starting to utilize big data generated by this new IoT technology. Conditions specific to IoT include large volumes of data generated by numerous devices and sensors; such data may contain items that provide no useful information, or items that require real-time processing and advanced security processing. Moreover, these conditions may vary with factors in the external environment such as the time, period and event. To meet these conditions, it is important to distribute processing steps over various locations, including endpoints, gateways and clouds, rather than performing all processing in one place. It is also necessary to be able to change dynamically what processing is performed, and where, according to the requirements and environment, to achieve high efficiency. In this scheme, a distributed system based on a network connecting the individual components plays a key role.
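The endpoint/gateway/cloud placement idea above can be sketched as a simple rule that assigns each processing step to a tier based on its latency and data-volume requirements. The tier names, thresholds and step descriptions below are illustrative assumptions, not Fujitsu's actual platform design.

```python
# Hypothetical placement rule for IoT processing steps, illustrating the
# distribution of work across endpoint, gateway and cloud; in a real
# platform such decisions would be re-evaluated dynamically as the
# environment changes.

def place(step):
    """Pick a tier for one processing step described by a dict."""
    if step.get("realtime_ms", float("inf")) < 10:
        return "endpoint"  # hard real-time: process at the data source
    if step.get("reduces_volume") and step.get("mb_per_s", 0) > 1:
        return "gateway"   # filter/aggregate near the edge to cut traffic
    return "cloud"         # everything else: heavy analytics centrally

steps = [
    {"name": "anomaly stop signal", "realtime_ms": 5},
    {"name": "sensor downsampling", "reduces_volume": True, "mb_per_s": 20},
    {"name": "fleet-wide learning"},
]
plan = {s["name"]: place(s) for s in steps}
print(plan)
```

The point of making placement a function of the step's requirements, rather than a fixed assignment, is that the plan can be recomputed when external conditions (time, period, event) shift, which is the dynamic behaviour the abstract calls for.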

Situations where Utilization of Big Data Opens Up New Businesses

  • Mighty Factory―Leveraging Big Data to Support Manufacturing Innovations On-site

Within Japan, manufacturers face a multitude of issues, such as rapidly ageing workforces at their production bases, difficulty in passing on skilled workers' know-how, and production trends shifting towards high-mix, low-volume production. This results in over-production of work in process, compromised product quality and a high incidence of facility breakdowns. In view of these circumstances, we reorganized the knowledge gained through supporting our customers in manufacturing and created "the Mighty Factory," a visualized representation of ideal manufacturing rendered using big data analysis technology. This will lead manufacturers not only towards solutions to the above-mentioned problems but also towards proactive investment. This paper explains the challenges our customers in the manufacturing industry face and describes the background against which the Mighty Factory came into being. We also touch upon the individual component functions of the Mighty Factory as well as tasks for the future.

  • Workshop Method Leading to Value Creation by Data Utilization

Utilization of information and communications technology (ICT) such as e-commerce, omni-channel retailing and the Internet of Things (IoT) has computerized corporate activities themselves, and an environment for utilizing external data is gradually being established through the promotion of open government and other means. In this situation, there are increasing expectations that new value will be created through big data analysis that makes use of large-scale data handling technologies and advanced statistical techniques. In reality, however, it is not easy to discover new value usable in business simply by analyzing data. To link data utilization with value creation, it is important to understand which approaches to data utilization best suit the purposes before they are studied in workshops. This paper first classifies the types of data utilization approaches that contribute to value creation and shows key points for successful data utilization workshops. It then presents the procedure for implementing the "data utilization-based value creation workshops" proposed by Fujitsu, together with workshop methodologies that systematize measures for clarifying the purposes and effects of data utilization.

  • Data Curation Service to Support Operational Improvement and New Business Creation Using Big Data

Making use of big data in business is a major challenge for many enterprises today. Meanwhile, artificial intelligence (AI) has made remarkable progress over the last few years, with ever-increasing amounts of data and technology to derive knowledge or patterns from them. Fujitsu's data scientists, called Curators, apply this same knowledge-deriving technology to big data in business. They are developing data usage models that focus on using AI for specific purposes, such as predicting the number of customers when a store newly opens, or predicting what kind of faults could occur with a machine, and when. These activities can change the role of ICT in business so that it serves to create operational innovation and new business from accumulated log data, using the computational power available today. This paper explains the difficulties in exploiting big data in business and describes the Curators' efforts to address them, including their analysis framework and exploratory approach.

  • Internal Initiatives Aiming to Realize "Smart Monozukuri"

At Fujitsu, we are enhancing our manufacturing environment through the Fujitsu Production System (FJPS). We introduced the Toyota Production System (TPS) to promote innovative manufacturing and expanded it into the development department, where it ultimately evolved into the FJPS. One of our most important long-term tasks is to use our own information and communications technology (ICT) in our manufacturing and development activities. This paper describes Fujitsu's next-generation manufacturing concept, "Smart Monozukuri," based on ICT, big data and the Internet of Things (IoT), and explains various initiatives to realize the concept. We aim to support the manufacturing industry and help its companies become more competitive in the global market by offering them the know-how, methods and tools gained through our internal initiatives.

  • Marketing Innovation to Grasp Real-world Consumer Behavior: Case of Innovation for Food Industry

In the early days, when the term big data was still new to many, people's attention was drawn to successful cases of using such data that pointed to the arrival of a new era. Only analytical specialists, so-called data scientists, could bring about those successes. Today, big data is more widespread, being leveraged at the business-user level by people in sales, manufacturing and other divisions for purposes such as improving in-the-field performance, front-line reform and task optimization. Fujitsu offers an integrated solution for data management and analysis so that business users without in-depth knowledge of data analysis or of information and communications technology (ICT) systems can meaningfully use big data themselves from a field-centered perspective. This paper describes a project on Fujitsu's integrated solution for leveraging big data, based on a case of marketing innovation for the food industry. The solution facilitates highly accurate marketing analyses that account for the characteristics of areas, shops, products and consumer clientele, drawing on various data related to consumer behavior, such as purchase and market data as well as other statistics.

  • Big Data Utilization for Enriching Lives of Pets and Their Owners: Animal Cloud

Fujitsu pursues the realization of social innovations designed to solve challenges in society and enrich people's lives through information and communications technology (ICT). One such pursuit is Animal Cloud, which aims to improve the lives of pets and their owners. This paper describes a case of big data utilization in the Animal Cloud project. Pet owners increasingly demand better veterinary services, such as early diagnosis and disease prevention as well as reasonable treatments for their pets. With these demands in mind, we collaborated with Anicom Holdings, Inc. to construct a core system for data collection and developed Anicom Receptor F, a data-entry interface dedicated to animal hospitals. By storing and managing clinical data from veterinary centers on Animal Cloud, and carrying out integrative analyses on the resulting big data, we aim to contribute to better animal health.