
March 11, 2024

8 min read

Artificial Intelligence Readiness

Best practices to ensure an industrial facility is ready for artificial intelligence service integration

It's all about the data.

Is data the new oil? Not necessarily. New oil reserves have inherent value because oil is a concentrated energy source and a vast infrastructure is already in place to unlock that value. For data, the mere presence of a lot of it is meaningless. It must be accurate, well labeled, concentrated, and accessible to become useful.

Artificial intelligence (AI) is transforming how we manage and optimize industrial facilities. By leveraging data insights, AI can help reduce operational costs, improve productivity, and enhance overall system performance. However, before integrating AI into any facility's control system, several key steps must be taken to properly prepare the data infrastructure.

An AI system is only as good as the data it receives. Inaccurate, incomplete, mislabeled, or missing data fed into the system will result in poor decisions coming out of it: garbage in, garbage out.

Phaidra has been implementing supervisory-level AI controls for the data center, pharmaceutical, and district energy industries for over five years since our founding, and much of our leadership and management team has decades of experience in control programming and mission-critical cooling operations. No matter how sophisticated the organization, we discover some very common operational data issues within industrial BMS, SCADA, and data historian systems. Those responsible for managing these systems are rarely neglecting these practices intentionally; sometimes old ways of working simply become the norm, even when outdated.

The following are some items to consider when preparing an industrial control system for AI integration and ensuring the data being collected is valuable for future use.

Collection Best Practices

The first step in preparing your control system for AI integration is to ensure that you are collecting high-quality data. This means that the data must be accurate, consistent, and reliable. To achieve this, you must establish data collection best practices and standards. This includes using high-quality sensors, calibrating them regularly, and setting up data validation and error-checking processes.
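
As a concrete illustration, the sketch below shows one way simple validation rules might be applied to incoming sensor readings before they reach a historian. The point names, plausibility limits, and the `Reading` structure are hypothetical and would need to match your own point list.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Reading:
    point: str          # e.g. "CHW-Supply-Temp-01" (hypothetical point name)
    value: float
    timestamp: datetime

# Hypothetical plausibility limits per point; tune these to your equipment.
LIMITS = {
    "CHW-Supply-Temp-01": (2.0, 20.0),   # degrees C
    "Chiller-1-Power": (0.0, 1500.0),    # kW
}

MAX_AGE = timedelta(minutes=5)  # readings older than this are considered stale

def validate(reading: Reading, now: datetime) -> list[str]:
    """Return a list of validation errors; an empty list means the reading passes."""
    errors = []
    low, high = LIMITS.get(reading.point, (float("-inf"), float("inf")))
    if not (low <= reading.value <= high):
        errors.append(f"{reading.point}: value {reading.value} outside [{low}, {high}]")
    if now - reading.timestamp > MAX_AGE:
        errors.append(f"{reading.point}: stale reading from {reading.timestamp.isoformat()}")
    return errors
```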

Improve Sensor Coverage:

Any industrial facility can likely benefit from a more granular collection of operational data. Sometimes a facility may only be collecting data at a systemic level, such as power consumption for an entire process rather than the individual consumption of each piece of power-consuming equipment within that process. Sensors are a good investment, especially considering the value that may be gained from their data in the future. More granularity in the operational data will prove valuable in the long run.

For some critical sensors, like chilled water distribution temperature in a mission-critical cooling system, it is worth investing in redundant sensors so a fault can be detected and fixed quickly. Install three sensors covering these critical points so that if one reads noticeably differently from the other two, you know which one needs calibration.
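
One common way to exploit that redundancy is simple median voting: treat the median of the three readings as the trusted value and flag any sensor that deviates from it by more than a calibration tolerance. The sketch below shows the idea; the point names and tolerance are illustrative, not from any particular site.

```python
import statistics

CALIBRATION_TOLERANCE = 0.5  # assumed tolerance in degrees C; set per sensor spec

def vote(readings: dict[str, float]) -> tuple[float, list[str]]:
    """Given three redundant readings {sensor_id: value}, return the median
    as the trusted value plus a list of sensors that likely need calibration."""
    trusted = statistics.median(readings.values())
    suspects = [sid for sid, value in readings.items()
                if abs(value - trusted) > CALIBRATION_TOLERANCE]
    return trusted, suspects

# Example: the third sensor reads noticeably warmer than its two peers.
value, needs_calibration = vote({"CHWST-A": 6.1, "CHWST-B": 6.2, "CHWST-C": 7.4})
print(value, needs_calibration)  # 6.2 ['CHWST-C']
```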

Expand Sensor Coverage:

For any process you would like to optimize against preselected KPIs, the more detailed the data you collect about everything that influences and directly impacts that process, the better. Sensor and data storage costs are considerably lower than they were a decade ago, so the added investment in more, higher-quality sensors can pay exceptional dividends when an AI system is integrated in the future.

Calibrate at a Regular Cadence:

Just as maintenance is performed on a schedule, sensors must be regularly reviewed and recalibrated, whether by in-house staff or the contractors responsible for them. A sensor left alone can collect and store incorrect data for so long that it compromises not only a future AI solution but also any effort to review historical data and determine the cause of an outage or failure. Don't let your sensors drift unchecked over time, just as you wouldn't let your equipment degrade without maintenance.
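
A lightweight way to keep that cadence honest is to track last-calibration dates against an interval and flag anything overdue, as in the hypothetical sketch below; in practice this register would live in your CMMS or historian metadata rather than in code.

```python
from datetime import date, timedelta

CALIBRATION_INTERVAL = timedelta(days=180)  # assumed six-month cadence

# Hypothetical sensor register with last calibration dates.
last_calibrated = {
    "CHWST-A": date(2023, 7, 10),
    "CHWST-B": date(2024, 1, 15),
    "OAT-01": date(2023, 2, 1),
}

def overdue(register: dict[str, date], today: date) -> list[str]:
    """Return sensor IDs whose last calibration is older than the interval."""
    return [sid for sid, when in register.items()
            if today - when > CALIBRATION_INTERVAL]

print(overdue(last_calibrated, date(2024, 3, 11)))  # ['CHWST-A', 'OAT-01']
```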

AI Readiness Checklist: Operational Data Collection & Storage Best Practices

Download our checklist to improve your facility’s data habits. Whether you are preparing for an AI solution or not, these will help increase the value of your data collection strategies.

Storage Best Practices

A robust data storage infrastructure that can handle the volume of data generated by your facility, and that is easily accessible for review, is extremely valuable to operations leaders. Best practices include selecting a suitable data storage solution, such as a data historian or a cloud-based platform, and ensuring it can handle your facility's data velocity, variety, and volume.

Semantic Trend Tagging:

Semantic tagging is a critical component of any future AI integration, and of human data analysis as well. When collecting data, tag it with meaningful, descriptive labels that help identify patterns and trends. These tags are then used to train AI models, improve data visualization, and provide context for analysis. To achieve effective semantic trend tagging, you must establish a standardized tagging framework that is consistent across your facility and designed to capture key operational metrics and performance indicators such as energy consumption, production output, and equipment efficiency.

If an operations team wants to go a step further, standardizing semantic tagging across multiple sites makes an even more valuable company-wide dataset possible. One good global standard to consider implementing is documented by Project Haystack.
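
As a small illustration of what a standardized framework can look like, the sketch below attaches Haystack-style marker and value tags to a couple of points and filters on them. The tag vocabulary follows Project Haystack conventions, but the point IDs, site names, and the `find` helper are made up for this example.

```python
# Haystack-style semantic tags: marker tags (True) describe what a point is,
# value tags carry metadata such as units and equipment references.
points = {
    "dc1-chiller1-power": {
        "point": True, "sensor": True, "power": True, "elec": True,
        "unit": "kW", "equipRef": "dc1-chiller1", "siteRef": "dc1",
    },
    "dc1-chwst": {
        "point": True, "sensor": True, "temp": True, "chilled": True,
        "water": True, "leaving": True,
        "unit": "°C", "equipRef": "dc1-chw-plant", "siteRef": "dc1",
    },
}

def find(points: dict, **required) -> list[str]:
    """Return point IDs whose tags match all required tag/value pairs."""
    return [pid for pid, tags in points.items()
            if all(tags.get(k) == v for k, v in required.items())]

print(find(points, sensor=True, temp=True))  # ['dc1-chwst']
```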

Usability of Stored Data:

Data is only as valuable as it is functionally available. The best data storage practices must include a way to handle variable and expressive data access so you can gain technical leverage from what is stored. Data locked in an inaccessible vault is worth as much as no data storage at all: nothing. Ensuring your historian supports programmatic access via protocols like OPC UA Historical Data Access (HDA) or Project Haystack's HTTP API provides data availability at your discretion. When selecting a data historian solution, verify support for such an access method, as well as for querying your data without crashing or otherwise failing under load.

Ask your potential data historian provider to confirm that you can export a few weeks' to a month's worth of data at a time without causing instability.
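
As one hedged illustration, the sketch below pulls history in week-sized chunks over a Haystack-style hisRead endpoint rather than requesting a full year in a single call. The base URL, point identifier, authentication, and exact response shape are assumptions that will depend on how your historian implements the API.

```python
from datetime import date, timedelta
import requests

BASE_URL = "https://historian.example.com/api/demo"  # hypothetical Haystack-compliant server
POINT_ID = "@dc1-chiller1-power"                     # hypothetical point identifier

def export_history(start: date, end: date, chunk_days: int = 7) -> list:
    """Pull history in small chunks so no single query overloads the historian."""
    rows = []
    cursor = start
    while cursor < end:
        chunk_end = min(cursor + timedelta(days=chunk_days), end)
        resp = requests.get(
            f"{BASE_URL}/hisRead",
            params={"id": POINT_ID,
                    "range": f"{cursor.isoformat()},{chunk_end.isoformat()}"},
            headers={"Accept": "application/json"},
            timeout=60,
        )
        resp.raise_for_status()
        rows.extend(resp.json().get("rows", []))  # assumes Haystack JSON grid encoding
        cursor = chunk_end
    return rows
```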

Data Historian Configuration:

A data historian is a key component of any control system, and it is critical to ensure that it is properly configured to support AI integration. This includes setting up data collection intervals, data retention policies, and backup and recovery procedures. You must also ensure that the data historian is able to handle different types of data, such as time-series data, relational data, and unstructured data.
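
The sketch below captures a few of these choices as plain configuration data; the point names, intervals, and retention periods are placeholders to adapt to your own facility, and most historians expose equivalent settings through their own admin tools.

```python
# Hypothetical historian configuration expressed as plain data.
HISTORIAN_CONFIG = {
    "collection_intervals": {
        "CHW-Supply-Temp-01": "30s",   # fast-moving process variable
        "Chiller-1-Power": "1min",
        "Outside-Air-Temp": "5min",    # slow-moving variable
    },
    "retention": {
        "raw": "13 months",            # keep at least one full seasonal cycle raw
        "aggregated_hourly": "5 years",
    },
    "backup": {
        "schedule": "daily",
        "offsite_copy": True,
        "restore_test_cadence": "quarterly",
    },
}
```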

Data Governance and Security:

Finally, it is critical to establish robust data governance and security practices to protect your data from unauthorized access or breaches. This includes setting up access controls, implementing data encryption, and establishing backup and recovery procedures. It is always a good practice to review and ensure the best procedures are in place and enforced.
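
As a minimal illustration of the access-control piece, the sketch below checks a requested action against per-role permissions. The roles and actions are hypothetical; a real deployment would integrate with your identity provider rather than a hard-coded table.

```python
# Hypothetical role-based permissions for historian access.
ROLE_PERMISSIONS = {
    "operator": {"read"},
    "analyst": {"read", "export"},
    "admin": {"read", "export", "configure", "delete"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("analyst", "export")
assert not is_allowed("operator", "delete")
```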

Preparing your control system for AI integration requires careful planning and attention to detail. By following these best practices, you can create a data infrastructure that is not only ready for AI integration but is already collecting and storing high-quality, accurate data ready for immediate use by an AI system or any human analysis. Small investments in the expansion and standardization of data collection and storage can yield actionable insights that achieve significant improvements in operational efficiency, productivity, and performance.

Featured Expert

Learn more about one of our subject matter experts interviewed for this post

Chris Vause

Head of Product

Chris Vause is a member of the Leadership team at Phaidra and serves as the Head of Product. He sets the strategic direction for product development and leads the domain expert teams responsible for architecting AI systems for customers. Prior to Phaidra, Chris worked at Trane Technologies in various capacities for over 12 years, including as an Applied Systems Engineer, a traveling Applications Engineer, and a Building Automation Services Technician.
