Cost effective and simple control and automation

This post is a translation of an article that originally appeared in SPS-MAGAZIN 8/2016. The original article can be found here.

The Institute of Fluid Mechanics at the Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) has developed an automation system that meets the requirements for testing of pilot-scale plants while still giving the flexibility needed for research. The following report describes the system, which is based on Arduino and Raspberry Pi.

Authors: Liam Pettigrew, Rolf Zech, Prof. Dr.-Ing. habil. Antonio Delgado

The Institute of Fluid Mechanics at the FAU in Erlangen was looking for a cost-effective yet easy-to-use solution for the automation of laboratory and pilot-scale processes, and decided to develop a system based on open-source hardware and software. Open-source hardware for automation has been used by hobbyists and developers for many years, but cannot be compared with larger industrial control systems: it usually offers only a limited number of digital and analog inputs and outputs, which puts it in the compact controller category. An alternative system that implements existing open-source hardware but removes these limitations was developed and built by engineers and technicians at the Institute of Fluid Mechanics. The system is modular, flexible, extensible, inexpensive and easy to program, with the Arduino platform serving as its controller.

Design of the lab-scale system. An optional extension is shown for future expandability. Multiple systems can be connected over the Ethernet interface. Connection over serial USB is possible directly with a PC rather than using a Raspberry Pi. The process devices shown are for example only. (Figure: Friedrich-Alexander-Universität Erlangen-Nürnberg)

Industry related research

At universities and in laboratories, students and researchers often need to control and automate processes for research projects under tight deadlines. The students and researchers are specialists in their fields, but have little or no experience with professional PLC hardware and software. It is therefore very difficult for them to familiarize themselves with complicated instrumentation and software in a short period of time. The question can also be raised whether a professional automation system is even required for smaller research projects. However, industrial partners often wish to see research results that are application-orientated and compatible with industry standards, where an appropriate control system with a high degree of accuracy manages everything.

Quick, easy and cheap

Process automation in research with industry partners must be quick to implement, so that scientists can concentrate on studying the process in proper detail. An easy-to-use programming language that is already well known from other applications is a prerequisite. The implementation of the automation system can be further simplified by the free availability of libraries and examples for quickly completing different tasks. Data acquisition and system control should use methods compatible with standard software, so that the user does not need to learn anything new. A big challenge for smaller research projects is the tight budget and timeline given for implementing process control when the research itself is not focused on automation. A PLC-based automation solution must therefore be inexpensive, license-free and easily understandable.

Free flow of information

Although commercial solutions for compact control already exist (e.g. Siemens Logo, Rockwell MicroLogix or Eaton easy), they are proprietary and protected systems. Flexible solutions, such as those often needed in research but not typical in industry, require the ability to bypass and vary parts of the system. Problems that often occur during research can be solved more easily with open-source hardware and software, thanks to access to system information and the large communities on the internet where ideas and solutions are openly and freely exchanged.

Housing for the Arduino module. The compact design allows for easy access to the USB port, I2C connectors and the 24V power supply. (Photo: Friedrich-Alexander-Universität Erlangen-Nürnberg)

Arduino and Raspberry Pi

The Arduino is a low-cost microcontroller board, originally developed in 2005 as a teaching tool for students at the Interaction Design Institute Ivrea. It has since become one of the most popular ‘do-it-yourself’ components in the world: over 700,000 official boards had been registered when last recorded in 2013. The Raspberry Pi is a low-cost single-board computer in credit card format, developed in the UK by the Raspberry Pi Foundation, founded in 2009 to promote the study of computer science in schools. About 5 million Raspberry Pis had been sold in the three years since their introduction, making it the best-selling British computer. The strength of these two modules lies in the combination of open-source electronics and software, including freely available source code and an easy-to-use, free development environment. Other components produced by companies such as Texas Instruments and Infineon are similarly constructed and can be implemented just as easily.

Modules for control

The system developed by the Institute of Fluid Mechanics is composed of individual modules interconnected via an I2C bus. The Arduino operates as the controller module, while the other modules act as interface circuits for digital or analog inputs and outputs, as well as measurement amplifiers for different sensor technologies. Each module has an address through which it can be controlled: eight digital output modules with 8 switching channels each (certain modules contain 16 channels) can switch up to 128 channels. Each module has an I2C bus controller with switch amplifiers, relays and contactors, where small loads of up to 100 mA per channel can be controlled directly. High-side switching stages allow for conventional ‘relay to ground’ wiring. Analog modules can process up to 64 channels using the available addresses. Another I2C bus controller type allows the inputs and outputs to be increased by a further 64 or 128 channels, and a multiplexer and bus driver allow the bus system and structure to be expanded further.

As can be seen in the pictures, the modules are contained in DIN rail housings with plug-in terminals; the housing width for each module is 22.5 mm. The supply voltage can be anywhere between 12 and 30 Volts (typically 24 VDC), thereby meeting electrical control cabinet requirements. The controller can be programmed and connected to a PC through the USB port on the front of the module. A carrier module allows a Raspberry Pi to be installed as a server within the cabinet. The server runs custom Java software that sends commands to, and collects all data from, the distributed Arduino controller modules.
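As a rough illustration of the addressing scheme described above, the mapping from a logical output channel to a module address and bit can be sketched in a few lines. The base address, channel count and layout below are hypothetical choices for illustration, not the institute's actual firmware:

```python
# Hypothetical mapping of a logical channel (0-127) onto the I2C bus layout:
# consecutive module addresses, 16 switching channels per module (the larger
# module variant mentioned in the text).

BASE_ADDRESS = 0x20          # assumed I2C address of the first output module
CHANNELS_PER_MODULE = 16

def channel_to_bus(channel):
    """Return (module I2C address, bit mask) for a logical channel."""
    if not 0 <= channel < 128:
        raise ValueError("channel out of range")
    module, bit = divmod(channel, CHANNELS_PER_MODULE)
    return BASE_ADDRESS + module, 1 << bit

# Example: channel 37 lands on the third module, bit 5.
addr, mask = channel_to_bus(37)
print(hex(addr), bin(mask))   # 0x22 0b100000
```

On real hardware the returned mask would then be written to the module's output register over the bus; here the mapping is kept as pure arithmetic so the scheme is easy to test.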

Finished modules in a control cabinet. The 24V power supply and I2C bus connect the modules on the upper side. Below are the I/O connections for the individual modules. (Photo: Friedrich-Alexander-Universität Erlangen-Nürnberg)

Open PLC without borders

Due to the wide range of addresses available for modules, enough channels are available for larger systems. The modular design allows for flexibility in construction, i.e. the PLC could consist of only analog or only digital channels if required. The I2C bus was introduced by Philips in the 1980s for television systems and is today used in everything from chip card readers and household appliances to indicator lights in the automotive industry. Data transfer rates and the bus capacitance (capacitive load) can be limiting when using I2C; however, this can be compensated for by using suitable components and circuits. The newest generation of Arduino microcontrollers are 32-bit, with much larger memory and higher clock speeds, making them more suitable for complex projects.
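The address headroom mentioned above can be checked with quick arithmetic: standard I2C uses 7-bit addresses, of which 16 are reserved by the specification. The per-module channel count used below is simply the larger module variant from the text:

```python
# Back-of-the-envelope check of the channel headroom on one 7-bit I2C segment.
SEVEN_BIT_ADDRESSES = 2 ** 7    # 128 raw addresses
RESERVED = 16                   # the I2C spec reserves 0000XXX and 1111XXX

usable = SEVEN_BIT_ADDRESSES - RESERVED
print(usable)                   # 112 addressable devices per bus segment

# With 16-channel output modules, a single segment could in principle serve:
print(usable * 16)              # 1792 digital channels, before any multiplexing
```

In practice capacitive load limits how many devices share one physical segment, which is where the multiplexers and bus drivers mentioned in the article come in.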

 

 

Machine learning in the water industry

The past few years have seen a huge surge in interest in artificial intelligence (AI). A number of factors have contributed to this, two of the biggest being the large increase in available data and the computing power that can crunch it. Researchers in machine learning have also been able to combine existing techniques and algorithms into new methods that can exploit these emerging resources.

It is difficult to avoid the articles on Facebook’s and Google’s usage of artificial intelligence. Magazines and newspapers are flooded with buzzwords and important people talking about the dangers of AI and how Skynet is going to take over the world just as Mr Cameron predicted.

But once we move past the hype and look at what is actually happening, we see that the methods being used are the same as, or very similar to, methods that statistics has used to make predictions from data for over 50 years. These techniques are anything but scary; they are in fact very important tools for providing valuable insights into the large and unwieldy amounts of data we are currently capable of generating.

Obvious existing examples in the water and wastewater industry include prediction of water supply and demand in cities, investigations into potential outbreaks through water supply systems, and environmental impacts of wastewater treatment and disposal. More recently, researchers have been focusing on the potential to estimate the effluent quality of wastewater treatment plants by using the large amounts of data they generate to train prediction models. The data generated at these plants will only increase as new and advanced sensors become more prevalent.

Another interesting aspect is the possibility of learning new water and wastewater treatment strategies from these machine learning algorithms. The classic example is TD-Gammon, where a self-learning algorithm was eventually able to beat even the best players, and changed the way people played the game by introducing winning concepts and strategies that had not been thought of before. The method combined reinforcement learning with neural nets, where the neural net learns an action strategy given certain state variables and potential rewards.
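The idea behind TD-Gammon can be sketched with a deliberately tiny stand-in: a tabular agent learning, from reward alone, to walk down a five-state corridor. The environment, parameters and table of action values are all illustrative simplifications; TD-Gammon itself used TD(λ) with a neural network over backgammon positions:

```python
import random

# Toy temporal-difference learning: states 0..4 in a corridor, reward only
# for reaching state 4. Everything here is an illustrative simplification.
random.seed(0)

N_STATES = 5
ACTIONS = (-1, +1)                # step left or right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for _ in range(2000):             # episodes of trial and error
    s = 0
    while s != N_STATES - 1:
        if random.random() < epsilon:
            a = random.choice(ACTIONS)            # explore
        else:
            a = max(ACTIONS, key=lambda a: Q[(s, a)])  # exploit
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        # TD update: nudge Q(s,a) toward reward plus discounted future value
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# After training, stepping right dominates in every non-terminal state.
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

The agent makes thousands of mistakes in a throwaway environment before it settles on the winning strategy, which is exactly why games are such a convenient training ground.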

[Figure: the reinforcement learning loop, in which the agent acts on the environment and receives a new state and a reward.]

In this case the environment is the game Backgammon, but Google’s DeepMind has used this same basic concept to recently beat top players in the more difficult (for AI at least) game of Go. Humans are already learning from the tactics that the machine uses in board games; could we also learn better ways to treat and distribute clean water from a machine?

The advantage of using old arcade and board games for machine learning is that the environments are easily definable and have strict boundaries and rules. The machine can learn by making hundreds of thousands of errors in simulations before having to take on a real human. Water and wastewater treatment is anything but easily definable! We also can’t let a machine expel millions of liters of untreated wastewater into our rivers and streams for the next hundred years until it learns how to treat the wastewater properly.

The solution is to produce a simulation of the treatment plant accurate enough that a machine can train itself on it. However, this is in itself a very difficult problem due to the multitude of microorganisms that can come into play and the constantly changing composition of wastewaters. Benchmark simulation models exist for testing control strategies, but even these standardised models require a fair amount of parameter calibration and variable initialisation to obtain a decent representation of the plant to be tested. Perhaps new information becoming available about the microorganisms present in these treatment facilities can be used to produce more accurate models with fewer of the specialised ‘research lab only’ measurements and approximations required for the current generation of models. Maybe the next generation of models will simulate the entire ecology of the plant right down to the cellular level? Imagine what a machine could learn from that!
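A toy version of "train on the simulator, not the river" might look like the following: a crude one-tank plant model with first-order pollutant removal, and a sweep for the cheapest constant aeration level that still meets an effluent limit. Every number here is invented for illustration; real benchmark models such as BSM1 are vastly richer:

```python
# Toy plant model: a single tank where fresh load mixes in each step and
# aeration drives biological removal. All coefficients are invented.

def simulate_effluent(aeration, inflow_conc=100.0, steps=200):
    """Crude simulation: higher aeration (0..1) removes pollutant faster."""
    conc = inflow_conc
    removal = 0.02 + 0.3 * aeration      # per-step removal fraction
    for _ in range(steps):
        conc += 0.05 * (inflow_conc - conc)   # fresh load mixing in
        conc *= (1.0 - removal)               # biological removal
    return conc

LIMIT = 20.0                              # hypothetical effluent limit

# "Training" here is just a sweep over the simulator: the cheap, safe place
# to make mistakes, exactly as the text argues.
candidates = [a / 100 for a in range(0, 101)]
feasible = [a for a in candidates if simulate_effluent(a) <= LIMIT]
best = min(feasible)                      # lowest aeration that still complies
print(round(best, 2), round(simulate_effluent(best), 1))
```

Replace the exhaustive sweep with a learning agent and this one-line model with a calibrated benchmark simulation, and the structure is the same: mistakes cost CPU time in the simulator instead of untreated water in a river.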