Intelligent Automation for Manufacturing: Part 2

[Image: Conveyor belt running through a lightbulb]

This article is a guest post from Alex Marcy, President of Corso Systems. If you have any questions for Alex about manufacturing automation, please contact him here or at 775.750.0540.

Previous Post - Intelligent Automation for Manufacturing - Part 1

One of the best ways to get more performance from your control system is to collect data from your process, analyze that data to measure where your process is today, and then combine the resulting information with your experience to uncover ways to optimize the process. The first step in this journey is to collect data from your process (or revisit your existing configuration if you already have a system in place).

Collecting Data From Your Process

There are a few options for collecting process data, depending on the size of your operation, how much data you want to collect, and how frequently you want to collect it.

If you have a small process, one way to get data is to use the historical trends built into your HMI application: trending data this way is easy to configure and provides basic analysis capabilities directly in your application. The downside is that this approach only scales to a relatively small number of tags, and it requires someone to open the HMI application to review the data, potentially adding license costs.

[Image: flow-trends]

The middle ground between HMI trending and a historian is to use scripting in your HMI to write data into a database at regular intervals. This method can remove the licensing requirements for accessing the data; however, it does require time to write, test, and maintain the scripts, and it can create performance problems as your information needs grow. You can also run into issues with storage frequency, as collecting data at high-speed intervals is resource intensive with the tools available.
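
To make the idea concrete, here is a minimal Python sketch of that kind of logging script. The read_tag() helper, tag names, database file, and one-minute interval are all placeholders I have made up for illustration; in practice you would call your HMI's or OPC server's scripting API and write to whatever database your site uses.

import random
import sqlite3
import time

def read_tag(tag_name):
    # Stand-in for your HMI or OPC server's scripting call; returns a
    # simulated value here so the sketch runs on its own.
    return random.uniform(0.0, 100.0)

TAGS = ["FlowRate", "TankLevel", "PumpSpeed"]  # hypothetical tag names

conn = sqlite3.connect("process_data.db")
conn.execute("CREATE TABLE IF NOT EXISTS samples (ts REAL, tag TEXT, value REAL)")

while True:
    now = time.time()
    for tag in TAGS:
        conn.execute(
            "INSERT INTO samples (ts, tag, value) VALUES (?, ?, ?)",
            (now, tag, read_tag(tag)),
        )
    conn.commit()
    time.sleep(60)  # one-minute logging interval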

One additional method is available depending on your control system hardware: collect historical data in your PLC or data concentrators, then import that data into a database. Again, this requires time to write, test, and maintain the scripting, and it can run into performance issues as your needs grow.
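
If your hardware can export its logged data to files, the import side can be as simple as the sketch below. It assumes the PLC or data concentrator produces a CSV with timestamp, tag, and value columns; the file name, column names, and table layout are illustrative only and will differ on your hardware.

import csv
import sqlite3

conn = sqlite3.connect("process_data.db")
conn.execute("CREATE TABLE IF NOT EXISTS plc_samples (ts TEXT, tag TEXT, value REAL)")

# Assumes the export contains timestamp,tag,value rows; adjust the file and
# column names to match what your hardware actually produces.
with open("plc_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        conn.execute(
            "INSERT INTO plc_samples (ts, tag, value) VALUES (?, ?, ?)",
            (row["timestamp"], row["tag"], float(row["value"])),
        )
conn.commit()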

The most common solution is to install a process historian. A process historian removes the headaches involved with scripting, gives you access to more capable data analysis tools, and makes it easy to correlate your process data with other information systems. This generally costs more than writing some scripts to move data around, but it gives you more functionality, better data compression (a simplified sketch follows the list below), and the ability to store data at any interval you might require. There are many process historians available; which one works best for you will depend on a variety of factors:

  • How the historian works with multiple facilities
  • How it interfaces with your control system hardware
  • What databases the process historian supports
  • How frequently the process historian can collect data
  • What the IT requirements are (network access, firewalls, and CPU/memory)
  • What data analysis tools are compatible with the process historian
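
To illustrate the compression point above, here is a simplified deadband-style filter of the kind many historians apply before writing samples to disk. Real historians use more sophisticated algorithms (swinging-door compression, exception and compression deviations), so treat this only as a toy version of the idea.

def deadband_filter(samples, deadband):
    # Store a sample only when it moves more than `deadband` away from the
    # last stored value; a toy version of historian compression.
    stored = []
    last = None
    for ts, value in samples:
        if last is None or abs(value - last) > deadband:
            stored.append((ts, value))
            last = value
    return stored

# A slowly drifting signal sampled 600 times compresses to a handful of points.
raw = [(t, 50.0 + 0.01 * t) for t in range(600)]
print(len(deadband_filter(raw, deadband=0.5)))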

Collecting Data From Your Staff

Along with process data, other systems can provide additional context to improve your operation. These typically include QA/QC or Laboratory Information Management Systems (LIMS), shift logs from your operators and supervisors, maintenance work orders, and process change management systems. Most of this data is collected manually and written on paper, stored in daily Excel spreadsheets, or kept in a proprietary system available only to a single department.

Now is a good time to assess which of these systems would be useful alongside your process data. The next post in this series will give you examples of this concept and show how we have helped some of our customers move from paper and Excel to centralized systems accessible to anyone.

Turning Your Data into Useful Information

After data is being collected automatically from your process, and you have centralized some of your manual data entry systems, the next step is to use data analysis tools to turn that data into information.

Depending on your control system and process historian choices, you will have a variety of off-the-shelf or custom data analysis tools to choose from. These may be as simple as trending tools, Excel Add-Ins, and reporting services, or as complex as a Statistical Process Control (SPC) or center-lining application that manages process setpoints and conditions based on product quality. Other popular analysis tools monitor OEE and downtime, manage energy usage, and handle other process-specific items like recipe management.
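
For a flavor of what an SPC tool does under the hood, here is a simplified control-limit calculation in Python. Production SPC packages estimate sigma from moving ranges or subgroups and layer run rules on top of the limits, so this is only a sketch of the concept with made-up measurements, not a substitute for those tools.

import statistics

def control_limits(values, sigma=3.0):
    # Simplified SPC limits: mean +/- 3 standard deviations. Real SPC tools
    # typically estimate sigma from moving ranges or subgroups.
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return mean - sigma * stdev, mean, mean + sigma * stdev

# Flag any measurement outside the limits as a potential special cause.
measurements = [74.1, 73.8, 74.3, 74.0, 75.9, 74.2, 73.7]  # illustrative data
lcl, center, ucl = control_limits(measurements)
out_of_control = [m for m in measurements if m < lcl or m > ucl]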

Most companies start out with a mix of trending tools, Excel Add-Ins, and reporting. This covers a lot of the bases for analyzing data and doesn't require much expertise in how the tools work to get value from them. This grouping has the added benefit of giving informational value to the entire organization.

  • Trending is great for operators/maintenance staff who want to see detailed information on a large number of tags
  • Excel is useful for process engineers and managers who need to easily query data and perform calculations on it
  • Reports can distill complex process information into easy-to-digest KPIs for the corporate side of the business (see the query sketch after this list)
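
As a small illustration of the reporting idea, the query below rolls the minute-level samples from the earlier logging sketch up into a daily average per tag; the database, table, and column names are the hypothetical ones used in that sketch, and a real report would pull from whatever historian or database you actually deploy.

import sqlite3

conn = sqlite3.connect("process_data.db")

# Roll minute-level samples up into a daily average per tag, the kind of
# number a report or an Excel query would surface as a KPI.
rows = conn.execute(
    """
    SELECT tag, date(ts, 'unixepoch') AS day, AVG(value) AS avg_value
    FROM samples
    GROUP BY tag, day
    ORDER BY day, tag
    """
).fetchall()

for tag, day, avg_value in rows:
    print(f"{day}  {tag}: {avg_value:.2f}")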

After seeing the value in the information these tools provide, the next logical step is to move into OEE, downtime, or process intelligence solutions. These will give you detailed analysis of the situations most commonly affecting your productivity, and ways to increase it without adding more equipment to the line.
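
OEE itself is a simple calculation once the underlying data is being collected: availability times performance times quality. A minimal sketch with illustrative shift numbers:

def oee(planned_minutes, run_minutes, ideal_rate, total_count, good_count):
    # Classic OEE: Availability x Performance x Quality.
    # ideal_rate is the ideal production rate in units per minute.
    availability = run_minutes / planned_minutes
    performance = (total_count / run_minutes) / ideal_rate
    quality = good_count / total_count
    return availability * performance * quality

# Example shift: 480 planned minutes, 420 minutes running, an ideal rate of
# 60 units/minute, 22,000 total units produced, 21,500 of them good.
print(oee(480, 420, 60, 22000, 21500))  # roughly 0.75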

[Image: analysis-tools]

Regardless of the tools, our customers have had the most success when starting with a small-scale rollout of their systems. This allows people to acclimate to the system, understand how it fits into their workflow, and see the value it provides.

Training

Finally, the most important piece of the information systems puzzle is making sure people understand how to actually use the tools as part of their daily duties. We have seen a lot of companies implement a process historian, trending tools, and sometimes the Excel add-in, then stop there. One or two people might act as internal champions, using the systems and pushing them up the org chart, without making any headway toward getting other people to use them.

This is where bringing people on board with the systems, training them to use the tools effectively, and making the tools easy to use comes into play. This should be part of any system design and rollout. Bring your people on board early in the process and get their input on how the systems should function. Ideally, these tools should make everyone's life easier, even in the short term while they are learning to use them properly. The only thing worse than not having data in the first place is spending time and money to build information systems no one uses.

Be sure to check out Part 3 of our Intelligent Automation series, where we will show you how some of our customers have used similar approaches to improve their bottom line.