Intel demonstrated the success of IoT applications in manufacturing by using Big Data analytics to deliver greater cost savings, predictive maintenance, and improved productivity in its own manufacturing processes.
The volume, variety, and velocity of data generated in manufacturing are increasing faster than in most industries, creating opportunities for data analysts to gain a competitive advantage, respond to changing market dynamics, and increase production rate, productivity, and efficiency.
Equipment across the factory is generating thousands of different types of data, such as unit production data at multiple levels, equipment operation data, processing data, and human operating data, which can be stored for short-term and long-term analysis.
While large manufacturers have used statistical process control (SPC) and statistical data analysis to optimize production for years, the sheer volume and variety of today's data create new opportunities to deploy new analytics approaches, infrastructure, and tools.
The manufacturing industry is ready to adopt Big Data, supported by higher computing performance, open standards, the availability of industry know-how, and a growing pool of skilled data scientists and statisticians.
With access to the new smart production system, manufacturers can improve quality, increase production throughput, better understand the root causes of production problems, and reduce machine failures and downtime.
With these new business values and technological capabilities, manufacturers will be able to change business models and practices to optimize design for manufacturability, improving supply chain management and introducing customized production services. This shortens the time to market for customized products aimed at smart consumers across different geographical areas.
This article outlines a pilot project of Internet of Things (IoT) and Big Data applications at one of Intel’s manufacturing facilities, to show how data analysis applied to factory equipment and sensors can bring operational efficiency and cost savings to the production process.
With partners Cloudera, Dell, Mitsubishi Electric, and Revolution Analytics, this Big Data IoT analytics project is forecast to save millions of dollars annually, along with additional business revenue.
How can value be extracted from all the data in production?
Big Data is characterized by huge data sets containing different types of data, which can be classified as structured, semi-structured, or unstructured, as shown in Table 1 below.
Structured data fits neatly into formatted tables, making it relatively easy to manage and process. Structured data has the advantage of being easy to import, store, query, and analyze.
Examples include production data stored in relational databases and data from manufacturing execution systems and enterprise systems. On the other hand, unstructured data such as images, free text, machine log files, operator-generated change reports, and text from social collaboration platforms can be in a raw format that requires decoding before meaningful values can be extracted.
Semi-structured data is a form of structured data that does not conform to the formal structure of the data models associated with relational databases or other data tables, but nonetheless contains tags or other markers to separate semantic elements and enforce hierarchies of records and fields.
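As a small illustration of the three categories, the snippet below contrasts a structured row, a semi-structured JSON document, and an unstructured log line; all field names and values are hypothetical, not taken from Intel's factory data.

```python
import json

# Structured: fixed fields, ready for a relational table (hypothetical row).
structured_row = ("LOT-1042", "ETCH-03", 812.5, "PASS")

# Semi-structured: JSON carries tags and nesting instead of a fixed schema.
semi_structured = json.loads("""
{
  "tool": "ETCH-03",
  "readings": [
    {"sensor": "chamber_temp_c", "value": 64.2},
    {"sensor": "rf_power_w",     "value": 812.5}
  ]
}
""")

# Unstructured: a raw log line that must be decoded before analysis.
unstructured = "2014-06-02 04:17:55 ETCH-03 WARN pressure drift detected"

# The tags in semi-structured data make fields addressable without a schema.
for reading in semi_structured["readings"]:
    print(reading["sensor"], reading["value"])
```

The tags (`tool`, `readings`, `sensor`) are what make the JSON document self-describing, while the log line yields nothing until a parser decodes it.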
In manufacturing, the power of Big Data technology stems from the ability to integrate and correlate these types of data sets to create business value through newly discovered insights. Another value proposition of Big Data technology is that it allows manufacturers to aggregate and centralize different types of data effectively, in a way that can be expanded in the future.
In manufacturing, process variation comes from a variety of factors, such as raw materials, recipes and methods, and equipment differences, driving manufacturers' real business need to transition to a scalable, platform-based Big Data solution that meets production and business requirements.
Machine data is closely related to productivity, quality, and output, thus providing valuable information to proactively detect processes that are out of control.
However, some production tools produce Big Data files (gigabytes over several days for each tool type, as in Table 2), limiting the ability to store, analyze, and extract useful information from them using conventional methods. Without Big Data technologies, it is difficult to visualize information in large data sets drawn from a variety of sources.
Building blocks for an end-to-end infrastructure enabling smart manufacturing from factory to data center
Figure 1 shows the high-level IoT production architecture for small to large data collections. It forms a data collection system that aggregates different types of data from production areas and production networks, opening up the ability to visualize, monitor, and mine data for new Business Intelligence (BI).
For example, the architecture can clean, extract, transform, and integrate structured data from existing databases as well as unstructured data from tool sensors and log files from older devices into the data storage platform (e.g., Hadoop).
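As a tiny sketch of that extraction step, the function below cleans and converts raw log lines into structured records; the log format, field names, and values are illustrative assumptions, since real tool logs vary by vendor.

```python
import re
from datetime import datetime

# Hypothetical machine log format: "<timestamp> <tool> <parameter>=<value>".
LOG_LINE = re.compile(
    r"(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) "
    r"(?P<tool>\S+) (?P<param>\S+)=(?P<value>[-\d.]+)"
)

def extract(line):
    """Clean and convert one raw log line into a structured record,
    returning None for lines that do not match (i.e., filtering them out)."""
    m = LOG_LINE.match(line.strip())
    if not m:
        return None
    return {
        "timestamp": datetime.strptime(m["ts"], "%Y-%m-%d %H:%M:%S"),
        "tool": m["tool"],
        "parameter": m["param"],
        "value": float(m["value"]),
    }

raw = [
    "2014-06-02 04:17:55 ATE-07 chamber_temp_c=64.2",
    "corrupted line that should be filtered out",
]
records = [r for line in raw if (r := extract(line)) is not None]
```

Once log lines are in this structured form they can be loaded into the storage platform and queried alongside the existing database data.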
Data can then be displayed and analyzed by high-level factory applications running in different virtual machines on the same on-premises server as the data host.
Additionally, the data can be accessed by other analytics or monitoring applications on the network. Other advanced capabilities may include running analytics in Hadoop or other file system types, or running in-memory analytics for faster performance. The results of the analysis can be presented to the user through visualizations in the business intelligence layer of the network.
A compact Dell system, the PowerEdge* VRTX, was chosen to host the analytics software and Big Data in a private cloud setting as an on-premises server. The system consists of the Dell PowerEdge VRTX with 25 900 GB hard disks and two Dell PowerEdge M820 servers, each equipped with four Intel® Xeon® processors from the E5-4600 product family.
The Intel Xeon processor E5-4600 product family offers a dense four-socket solution, cost-optimized with up to eight cores per processor, up to 20 MB of last-level cache (L3), and up to 1.5 TB of memory capacity, along with integrated I/O lanes for faster data movement.
The two M820 servers host the analytics and application software and the Hadoop nodes, which run in multiple virtual machines. Using Red Hat Enterprise Linux* as the virtual data center operating system provides a complete server virtualization software solution designed for a fully scalable virtualized data center.
Analytics node and Analytics Apps
Figure 3 shows how the software is allocated to the different virtual machines. The analytics node VM runs several analytics and application workloads:
- Revolution R Enterprise* from Revolution Analytics: an analytics platform built on the powerful open source R statistical language. The software provides a secure, seamless data bridge between analytics solutions and enterprise software, thereby solving the main integration problem businesses face when deploying R-based analysis alongside existing IT infrastructure.
- By linking to the multi-threaded, high-performance Intel® Math Kernel Library (Intel® MKL), Revolution R Enterprise uses the power of multiple processors to accelerate common matrix computations. In addition, Revolution R Enterprise's parallel, distributed algorithms enable Big Data statistical analyses to scale linearly across multi-core servers, high-performance computing clusters, Big Data appliances, and Hadoop*.
- MonetDB*: an open source column-oriented database management system designed to provide high performance on complex queries against large data sets, such as joins of tables with hundreds of columns and millions of rows. MonetDB has been applied in high-performance applications for data mining, online analytical processing (OLAP), geographic information systems, and online data processing.
- PostgreSQL*: a powerful, open source object-relational database system used for online transaction processing (OLTP).
- AquaFold*: an application server used to quickly build and deploy production-quality database and reporting applications. It includes capabilities such as multi-source data syncing, cross-database data migration, data conversion and loading, scheduled data export/import, and custom BI dashboards. It supports communication with various production databases on the network, such as PostgreSQL, Microsoft SQL Server*, Oracle*, and IBM DB2*, through ODBC* and JDBC* connections.
- Hadoop nodes*: four virtual machines are provisioned to run a basic Cloudera Hadoop cluster consisting of four nodes, including one head node and three worker nodes.
- Apache Hadoop* is an open source software platform for scalable distributed computing. Written in Java, it runs on a cluster of industry-standard servers configured with direct-attached storage, and it scales cost-effectively by adding commodity nodes to the cluster.
- Cloudera Enterprise Data Hub* (CDH) provides a unified platform for Big Data: one place to store, process, and analyze all data, extending the value of existing investments and enabling fundamentally new ways to derive value from data. CDH is 100% Apache-licensed open source and is unique in offering unified batch processing, interactive SQL, interactive search, and role-based access controls.
Internet of Things gateway (IoT gateway)
The gateway is based on Mitsubishi Electric's Intel® Atom™ processor-based C Controller of the MELSEC-Q* series, which is used to securely aggregate and ingest data into the Big Data analytics server.
Data ingestion is the process of validating, filtering, and reformatting data so that the Big Data analytics software can operate on it more easily.
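The three ingestion steps named above might be sketched as follows; the sensor names, field layout, and plausibility limits are illustrative assumptions, not the gateway's actual schema.

```python
# Hypothetical plausibility limits per sensor, used for validation.
SENSOR_LIMITS = {"vacuum_kpa": (10.0, 101.0), "temp_c": (-20.0, 120.0)}

def ingest(sample):
    """Validate, filter, and reformat one raw sample; return a
    normalized record, or None if the sample should be dropped."""
    sensor, value = sample.get("s"), sample.get("v")
    if sensor not in SENSOR_LIMITS:          # validate: known sensor only
        return None
    lo, hi = SENSOR_LIMITS[sensor]
    if not (lo <= value <= hi):              # filter: physically plausible
        return None
    # reformat: terse gateway fields -> analytics-friendly schema
    return {
        "sensor": sensor,
        "value": round(value, 2),
        "unit": sensor.rsplit("_", 1)[-1],
    }

samples = [
    {"s": "vacuum_kpa", "v": 55.337},   # kept and normalized
    {"s": "vacuum_kpa", "v": 900.0},    # dropped: implausible reading
]
clean = [r for s in samples if (r := ingest(s)) is not None]
```

Filtering implausible readings at the gateway keeps bad sensor values from skewing the analytics downstream.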
Mitsubishi Electric's MELSEC-Q series C Controllers are embedded solutions equipped with many characteristic features of intelligent systems, including strong network connectivity and the high computing performance necessary to process large amounts of data collected from sensors or over a network, while supporting sophisticated system operations and controls.
At the core of this controller is a hardware platform based on Intel® architecture and the Wind River VxWorks* real-time operating system.
Mitsubishi Electric developed the MELSEC-Q series C Controller to meet the diverse requirements of factory automation, including excellent reliability, tolerance of harsh environments, and long service life. These features make it a powerful, reliable, low-maintenance product for IoT manufacturing applications.
In place of the ladder logic used in conventional programmable logic controllers (PLCs), the MELSEC-Q series C Controller is programmed in the international standard C languages (C and C++) for greater flexibility. This allows users to make the most of their existing C language software and development expertise.
CIMSNIPER* is a data collection and processing package for the Mitsubishi Electric MELSEC-Q series C Controller. It can collect process data (including SECS messages) and manufacturing equipment errors without modifying existing systems.
Big Data Analytics at Intel Factory
Over the past two years, Intel has developed more than a dozen Big Data projects that improve both operational efficiency and the bottom line.
Here are a few examples of these analytical applications:
Reduce product inspection time: Each Intel® chip produced undergoes several stages of thorough quality testing involving a complex series of tests. Intel found that by using historical information collected during production, the number of necessary tests could be reduced, shortening testing time. Deployed as a proof of concept, the solution avoided $3 million in testing costs in 2012 for a line of Intel® Core™ processors.
Improved production process monitoring: Data-intensive processing also helps Intel detect failures on its production line, a highly automated environment. Intel pulls log files from production tools and testers across the entire factory network, which can amount to up to 5 terabytes an hour. By capturing and analyzing this information, Intel can determine when a specific step in one of its manufacturing processes begins to deviate from its usual tolerances.
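A deviation check of this kind can be reduced to a simple SPC-style rule; the sketch below flags a reading that strays more than three standard deviations from its historical mean, with all numbers being illustrative rather than real tolerances.

```python
from statistics import mean, stdev

def out_of_tolerance(history, new_value, k=3.0):
    """Flag a reading that deviates more than k standard deviations
    from the historical mean -- a simplified SPC-style control rule."""
    mu, sigma = mean(history), stdev(history)
    return abs(new_value - mu) > k * sigma

# Hypothetical historical readings for one process step.
history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]

print(out_of_tolerance(history, 10.05))  # False: within tolerance
print(out_of_tolerance(history, 12.5))   # True: step has drifted
```

Production systems would apply such rules per tool and per parameter over rolling windows, but the core comparison is the same.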
Visualizing production from start to finish, Intel coordinated the pilot project described here in collaboration with Mitsubishi Electric, Cloudera, Revolution Analytics, and Dell, successfully pioneering capabilities that make major strides in using data mining tools to solve actual production problems, thus saving Intel millions of dollars through cost avoidance and improved decision-making. The main objective of the project is to extract the value proposition of the data and analyze it for better insight into predictive production, reducing production costs without reducing throughput or quality.
Intel experimental results
Case Study 1: Reduce productivity losses by monitoring and analyzing machine parameters and replacing parts in time, before they break down
Automated test equipment (ATE) is a machine designed to perform tests on various devices, called devices under test (DUTs). ATE uses automated control systems and information technology to rapidly test and evaluate DUTs. The ATE system uses an automated positioning tool, called a handler, that places the DUT on the Test Interface Unit (TIU) so it can be measured by the equipment.
A faulty TIU can cause DUTs to be misclassified, including rejection of good units, which negatively affects Intel's production operating costs: when a faulty TIU misclassifies good units as bad, those units are scrapped. Intel's production goal is to detect TIU errors before they occur, so the TIU can be fixed or replaced before units are misclassified. To avoid such problems, some components were replaced with spare parts during regular preventive maintenance, even when they were still functioning normally.
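One simple way to catch a degrading TIU before the online process control trips is to watch its recent reject rate drift above baseline. The sketch below is an illustrative monitor, not Intel's actual model; the window size, baseline rate, and alarm ratio are all assumptions.

```python
from collections import deque

class TIUMonitor:
    """Flag a TIU for inspection when its recent reject rate climbs
    well above the expected baseline (a simplified predictive rule)."""

    def __init__(self, window=200, baseline=0.02, ratio=3.0):
        self.results = deque(maxlen=window)  # recent pass/fail outcomes
        self.baseline = baseline             # expected reject rate
        self.ratio = ratio                   # alarm threshold multiplier

    def record(self, rejected):
        self.results.append(1 if rejected else 0)

    def needs_inspection(self):
        if len(self.results) < self.results.maxlen:
            return False                     # not enough evidence yet
        rate = sum(self.results) / len(self.results)
        return rate > self.ratio * self.baseline

monitor = TIUMonitor()
for i in range(200):                         # healthy period: ~2% rejects
    monitor.record(i % 50 == 0)
print(monitor.needs_inspection())            # False

for i in range(200):                         # degrading TIU: ~10% rejects
    monitor.record(i % 10 == 0)
print(monitor.needs_inspection())            # True
```

In practice the prediction would combine many machine parameters rather than reject rate alone, but the goal is the same: replace the TIU before it starts scrapping good units.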
Results and benefits
Analytics predicted up to 90% of potential TIU failures before they were flagged by the plant's current online process control system. In this case, this helped replace faulty TIUs before they rejected good units, reducing productivity losses by up to 25%. In addition, Intel could save on spares by reducing the need to replace parts that had not yet failed during preventive maintenance, resulting in a 20% reduction in spare-part costs.
Case Study 2: Reduce productivity losses by eliminating or minimizing mispicked solder balls in the ball attach module
The ball attach module is where solder paste is printed onto the lands on the substrate. Solder balls are placed onto the paste-printed lands, and the paste holds them in place. The whole package then passes through a reflow oven, melting the paste and balls and bonding them to the lands.
Solder balls are vacuum-picked into the small holes of the placement head. The head is checked for extra or missing balls. After the head is aligned with the substrate, the balls are placed into the solder paste on the substrate. After ball release, the placement head is checked for any remaining balls. Finally, the substrates are inspected by a camera vision system for any missing or shifted balls.
Units with missing balls mean scrapped material and lost productivity. There are many scenarios in which balls can be missing from units, including insufficient vacuum pressure.
Results and benefits
By visualizing and correlating sensor readings with machine data and various execution system data, Intel was able to reduce productivity losses, optimize maintenance costs, and avoid sudden equipment downtime. This allows technicians to troubleshoot problems proactively, on the journey toward predictive maintenance.
Case Study 3: Use image analysis to identify good or defective products
A machine vision device is a module that screens units and classifies them as good or marginal. Good units are sent forward for processing, while marginal units are inspected and classified by a production specialist as good or faulty. This manual process takes time.
Testing and classifying marginal units is complex and can take about 8 hours to successfully separate the truly rejected units from a batch of marginal units. This is due to the time it takes for units to reach the classification step, flow to a separate module, and finally be separated. Image analysis makes it possible to identify reject units moments after they are inspected by the test module.
Results and benefits
The marginal-unit images recorded at the machine vision module are pre-processed. Each image, which is unstructured data, is resized, cropped, and converted to grayscale before each pixel is converted to binary.
The next stage of the process involves feature selection, in which the unstructured images are described by a set of distinct values. These values are then fed into various machine learning algorithms to separate actual rejects from marginal units.
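The preprocessing and feature-selection steps described above might be sketched as follows on a tiny synthetic "image"; a real pipeline would use a vision library on full-size photos, and the threshold and chosen features here are illustrative assumptions.

```python
def to_grayscale(image_rgb):
    """Average the R, G, B channels of each pixel."""
    return [[sum(px) / 3 for px in row] for row in image_rgb]

def binarize(gray, threshold=128):
    """Convert each grayscale pixel to 0 or 1."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def features(binary):
    """Distinct values summarizing the image for a classifier:
    fraction of bright pixels and a per-row bright-pixel profile."""
    flat = [px for row in binary for px in row]
    return {
        "bright_fraction": sum(flat) / len(flat),
        "row_profile": [sum(row) for row in binary],
    }

# A hypothetical 2x2 RGB image: bright left column, dark right column.
image = [[(200, 200, 200), (30, 30, 30)],
         [(210, 190, 200), (25, 35, 30)]]

f = features(binarize(to_grayscale(image)))
print(f["bright_fraction"])   # 0.5
```

The resulting feature dictionary, rather than the raw pixels, is what would be handed to the machine learning algorithms for classification.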
Image analysis shortens the time needed to pick out actual rejects from a group of marginal units, identifying errors about 10 times faster than the manual method.
Summary and conclusions
Big Data analytics and the Internet of Things in manufacturing, delivered as an end-to-end platform, form an important backbone for enabling the vision of smart manufacturing. The platform is scalable and available in various configurations using existing industry-standard building blocks.
Intel integrated and validated an on-premises server analytics solution with data extracted from Intel's own production network and from devices and sensors through an Internet of Things gateway, validating the business value of the Internet of Things in manufacturing. Mitsubishi Electric MELSEC-Q series C Controllers were used to collect the data.
The pilot project involved close cooperation between factory engineers, the IT department, and industry experts from Cloudera, Dell, Mitsubishi Electric, and Revolution Analytics. The team began by leveraging existing machine performance and monitoring data, then applied Big Data analytics and modeling to that data to predict potential excursions and failures.
Being able to predict machine component failures allows engineers to make repairs and prevent excursions, yielding huge savings in scrapped production units, repair time, and machine components.
An integrated architecture of various software building blocks on the Big Data analytics server and IoT gateway was used. This framework can be applied and deployed by manufacturers who have not yet begun to take advantage of the intelligence available in their production data. Manufacturers already using data to improve efficiency can gradually build on their existing capabilities to take their data mining and analysis to the next level.
For Intel, this pilot project is forecast to save millions of dollars (USD) annually, with additional revenue from investment business value that Intel is still quantifying. Benefits include improved uptime of equipment components, minimized misclassification of good units as bad (thereby increasing yield and productivity), predictive maintenance, and reduced component failures.