Two case studies highlight the development of control systems in manufacturing automation.

What was initially planned as the refurbishment of an automated guided vehicle (AGV) system serving a press at the Toyota plant in Burnaston, UK, turned into something much more radical when the company realised that a simple upgrade would not satisfy new requirements imposed by the impending launch of its Auris and Avensis vehicles.

Instead the AGV system, which had originally comprised eight vehicles but had been reduced to just one serviceable unit, was removed and replaced by a completely new system that uses three rail-guided trolleys. But the change in materials handling technology was also accompanied by a complete renewal of the control technologies to provide a system that is now much more flexible, efficient and “future-proofed”.
 
That assessment is provided by Todd Montpas, automotive and tyre market development manager for Rockwell Automation in the US, whose UK-based solution provider AND Automation installed the new set-up.

At its core, the system has a CompactGuardLogix programmable automation controller (PAC) from Rockwell company Allen-Bradley, with communications to all primary devices and various input/output (I/O) points via EtherNet/IP. Moreover, each trolley car has remote controls, I/O, safety I/O and on-board intelligence to detect and react to passive “dumb” targets on walls and machines. The main panel feeds remote panels which house Allen-Bradley PowerFlex variable speed drives for positional control, as well as safety I/O and I/O for all three trolleys.
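The topology described here can be summarised as a simple data structure. Everything below is an illustrative sketch drawn only from the description in the text – the device names and per-trolley layout are assumptions, not the actual Burnaston configuration.

```python
# Illustrative map of the control architecture described above: one
# CompactGuardLogix PAC communicating over EtherNet/IP with remote panels
# housing drives and I/O for each of the three rail-guided trolleys.
# All names are assumptions drawn from the article, not real device tags.
TOPOLOGY = {
    "controller": "CompactGuardLogix PAC",
    "network": "EtherNet/IP",
    "trolleys": {
        f"trolley_{n}": {
            "drive": "PowerFlex variable speed drive",  # positional control
            "io": ["remote I/O", "safety I/O"],
            "sensing": "on-board detection of passive targets",
        }
        for n in (1, 2, 3)
    },
}
```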
 
Gaining greater control
According to Montpas, a key aspect of the overall system is the way safety is “programmable” rather than “hard-wired”, a characteristic exemplified by the “safe speed” and “safe torque” capabilities provided by the controllers. This makes it possible for people to enter the area behind the guard screens without the trolleys coming to a complete halt. Instead, they slow down to allow production operations to continue, albeit at a reduced rate – “one station can be worked on while the others are doing their job,” he confirms. Thus unplanned maintenance does not have the disruptive effect it otherwise would. Moreover, the capability is “programmed directly into the drives” and does not require any supplementary hardware, such as relays or speed controllers – hence, Montpas states, it is very “cost-effective”. In addition, recovery times after an unscheduled interruption to production are much faster than previously.
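The “safe speed” behaviour Montpas describes amounts to a speed-setpoint selection rule: rather than an emergency stop, zone intrusion commands a reduced rate. A minimal sketch, with illustrative normalised speed values (the real limits live in the drives’ safety configuration):

```python
# Sketch of the "programmable safety" behaviour described above: when a
# guarded zone is occupied, drives drop to a safe limited speed instead of
# stopping. Speed values are illustrative, not Toyota's configuration.

NORMAL_SPEED = 1.0   # full production rate (normalised)
SAFE_SPEED = 0.25    # reduced rate while a person is inside the guard

def commanded_speed(zone_occupied: bool, estop: bool) -> float:
    """Return the speed setpoint a drive should follow."""
    if estop:
        return 0.0          # an emergency stop always wins
    if zone_occupied:
        return SAFE_SPEED   # slow down, but keep producing
    return NORMAL_SPEED
```

Because the rule is evaluated continuously, production resumes at full rate the moment the zone clears – consistent with the faster recovery times mentioned above.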
 
Meanwhile, another attribute of what Montpas terms the “network-based approach” is greatly enhanced diagnostics – a highly valuable capability in a 24/7 production environment. Remote monitoring would certainly be feasible at some future point, he says, though for the moment Toyota keeps that responsibility in-house. The point about using EtherNet/IP in this instance, Montpas adds, is the extent of the “data capabilities” it provides. Alternatives are available but the reality is that it is becoming standard. “You do not need any special hardware or switches and most people are moving towards it,” he states.

The whole installation illustrates a number of salient points about modern automated manufacturing systems. One is simply the ease and speed with which such systems can be installed. The system at Burnaston went from specification to operation in only a few months, enabled by the ability to reuse much of the wiring of the previous system. “You can tie in to the existing infrastructure,” notes Montpas, who adds that actual installation took advantage of the annual shutdown period for the plant.
 
Another is the enhanced provision for adaptation to suit new requirements as they arise, because functionality may be changed simply through reprogramming rather than reconfiguration, which is quicker and easier to carry out. Effectively, Montpas argues, new systems can be created without any significant further investment.
 
Vision system – case study
A UK supplier of automated inspection equipment, Industrial Vision Systems has just delivered a project intended to reconcile the high-specification finishing of a car interior with passenger safety regulations. The system, supplied to the Chinese operation of a UK-based manufacturer of luxury vehicles, ensures that the stitching on the leather cover of a front console is positioned precisely enough to act as a perforation through which an airbag, triggered by an impact, can burst unimpeded from the opposite side.
 
According to Earl Yardley, director of the company, the system is identical to a number of previous implementations supplied to the UK operation of the customer but is still unusual in general automotive industry terms. He explains that it involves the use of two “standard industrial video cameras” which are fitted to either side of the press that applies the leather to the console substructure. The cameras are modified to work in the infrared (IR) spectrum via appropriate filters. IR LED lighting is then used to illuminate the inspection process, which otherwise takes place in ambient lighting conditions with no need for any shielding.
 
The IVS system enables precise stitching on the leather cover of the front console, which is important for passenger safety
Yardley says that the use of IR illumination reduces the effect of changes in ambient conditions, as well as helping the system to cope with different combinations of leather colours. The nature of the process means that it is impossible to manoeuvre the cameras into a position that is very close to the surface that is being inspected, so a combination of thresholding and gauging within the vision inspection routines measures the position of the leather to a datum – an H-shaped pattern in the stitching is the crucial reference point – and automatically confirms the position before sealing is carried out.
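The thresholding-and-gauging step can be sketched as follows. The threshold, the camera calibration constant and the synthetic feature are all illustrative assumptions; the real routines locate the H-shaped stitch pattern under IR lighting and work to the 200-micron accuracy figure quoted later in the article.

```python
import numpy as np

TOLERANCE_UM = 200     # accuracy requirement quoted in the article
UM_PER_PIXEL = 50      # assumed camera calibration (hypothetical value)

def gauge_offset_um(image: np.ndarray, datum_xy: tuple,
                    threshold: int = 128) -> float:
    """Threshold the image, find the centroid of the bright stitch
    feature, and return its distance from the datum in microns."""
    mask = image > threshold                 # binary thresholding
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()            # feature centroid (pixels)
    dx, dy = cx - datum_xy[0], cy - datum_xy[1]
    return float(np.hypot(dx, dy) * UM_PER_PIXEL)

def position_ok(offset_um: float) -> bool:
    """Confirm the position before sealing is allowed to proceed."""
    return offset_um <= TOLERANCE_UM
```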
 
Difficult demands
The demand the system has to satisfy, though, is exacting and inflexible. Yardley confirms that the manufacturing process has to be validated to EU regulations which mandate 100% inspection. The level of accuracy required by the safety-critical nature of the application is 200 microns.
 
The application involves mounting two cameras on the Z-axis of the press so that they move down with the press as it positions itself to secure the leather to the console. Before the leather is finally sealed onto the main structure, a programmable logic controller instructs the cameras to inspect the perforation prior to the press being permitted to continue with the process. Though the console structure is loaded onto the press manually, the inspection process itself is fully automated.
 
When vision technologies are used as a quality control technique at a much broader level across a whole assembly plant, they can prove to be problematic. The sheer amount of data they generate can result in the creation of multiple isolated databases from which it is impossible to draw macroscopic information to identify the fundamental causes of faults and initiate remedial action.
 
Issues with false rejects
That situation can even create a system of false error reporting that can result in unnecessary rejects and interruptions to production flows. In short, a wide-area, vision-based quality control system is a “challenge to set up” and if not done properly can produce a large number of spurious “first time quality hits”. So says Dan McKiernan, president of eFlex Systems, the US-based supplier of automated ‘assembly support’ technologies. He confirms that in his company’s customer base “false rejects” have previously been the primary quality-related problem requiring a solution.
 
Sometimes, McKiernan continues, the issue can be resolved fairly simply – particular circumstances at a single camera location, such as poor illumination, may be to blame. However, he also explains that modern vision systems can provide far more than a simple pass/fail capability. They can generate significant 'metadata' in addition to straightforward images, which can be used to support detailed analysis. In turn, this can feed statistical graphing to predict whether trending is moving out of tolerance. McKiernan gives the examples of:
 
• Events such as start-up, user log-on and log-off, switching into online and offline operation, and job changes
• Changes to system settings
• Changes to settings in a track-and-trace job, as well as batch changes
• Changes in focus, lighting and part position.
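The trend-prediction idea above can be sketched as a simple drift check: fit a line through recent measurements and ask whether extrapolating it ahead would cross a tolerance limit. The window, horizon and limit values below are illustrative assumptions, not eFlex parameters.

```python
import numpy as np

UPPER_LIMIT = 200.0   # tolerance limit; an illustrative value

def trending_out(measurements, horizon: int = 10,
                 limit: float = UPPER_LIMIT) -> bool:
    """Fit a straight line through recent measurements and report whether
    extrapolating `horizon` samples ahead would exceed the limit."""
    y = np.asarray(measurements, dtype=float)
    x = np.arange(len(y))
    slope, intercept = np.polyfit(x, y, 1)   # least-squares line
    predicted = slope * (len(y) - 1 + horizon) + intercept
    return bool(predicted > limit)
```

A process that is drifting upward would be flagged before any individual part actually fails inspection, which is exactly the kind of early warning a simple pass/fail system cannot give.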
 
None of this is possible without a single database that stores, integrates and makes accessible all the data gathered by a factory-wide system. Right now, says McKiernan, that is all too often not the case. Instead there may just be a series of inspection stations that create a dispersed repository of images with no overall “coordination or validation” – a “bucket”, as McKiernan bluntly terms it – from which images might be automatically deleted as local storage capacities are exceeded.
 
By itself, though, a unitary database will not be enough. “You also need a feedback loop and that requires a reliable, stable network,” McKiernan states. The next essential is a “common naming convention across all cameras” that enables every single image to be identified and traced to a specific place and time – “what it is, where and when it came from and what camera took the image.” Then the system must cope not just with the size of the database but the rate at which it will be continuously augmented – possibly as high as “two hundred images a minute”.
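The “common naming convention” point can be illustrated with a toy scheme in which every filename encodes what the image is, where and when it came from, and which camera took it. The field order and separator here are my assumptions, not eFlex's actual convention.

```python
from datetime import datetime

# A hypothetical naming convention: plant, station, camera, part serial
# number and timestamp, joined with underscores. Fields must not themselves
# contain underscores for the round-trip parse to work.

def image_name(plant: str, station: str, camera: str, serial: str,
               ts: datetime) -> str:
    """Build a traceable filename for one inspection image."""
    stamp = ts.strftime("%Y%m%dT%H%M%S")
    return f"{plant}_{station}_{camera}_{serial}_{stamp}.png"

def parse_image_name(name: str) -> dict:
    """Recover where and when an image came from, and which camera took it."""
    plant, station, camera, serial, stamp = name.rsplit(".", 1)[0].split("_")
    return {"plant": plant, "station": station, "camera": camera,
            "serial": serial, "timestamp": stamp}
```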
 

As far as eFlex’s own system is concerned, says McKiernan, that last requirement in particular has dictated a move away from a previous reliance on the Windows operating system to its Linux counterpart. The migration, which took place a few years ago, has enabled it to deploy a database technology, MongoDB, which is robust enough to cope with those demands. It has also, he adds, facilitated the use of the latest HTML5 interface to enable distributed, simultaneous viewing of the database. As for the total size of the database that may need to be handled, McKiernan says that the largest one in the company’s user-base has 96 terabytes of storage, a figure that in practice comes down to about 70 terabytes, allowing for a margin of redundancy. It currently stores about 70 million images but is “not yet full”.
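The figures quoted support a quick back-of-envelope check; the arithmetic below is mine, not eFlex's, and since the store is “not yet full” the result is an upper bound on average image size rather than a measured figure.

```python
# If roughly 70 million images eventually filled the ~70 TB of usable
# storage quoted above, the average image would be about 1 MB.
USABLE_TB = 70
IMAGE_COUNT = 70_000_000

usable_bytes = USABLE_TB * 10**12                 # decimal terabytes
avg_image_mb = usable_bytes / IMAGE_COUNT / 10**6
print(avg_image_mb)                               # → 1.0

# At the quoted peak of 200 images a minute, that implies a sustained
# ingest rate of roughly 200 MB per minute, or about 3.3 MB/s.
ingest_mb_per_s = 200 * avg_image_mb / 60
```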
 
Data gathering & integration
A recent project in the US involving what eFlex will only describe as the “2 million square foot transmission manufacturing plant with 2,000 employees of a global manufacturer” exemplifies many of the points McKiernan is keen to highlight. The plant had some 50 PCs gathering roughly 200,000 images per day, but this system was not integrated and required staff to visit each storage area to retrieve images associated with a particular serial number. Even then, there could be no certainty that images would be available. Nor were there any reporting or data gathering capabilities.
 
A $323m investment project to produce a new six-speed, front-wheel drive unit made this set-up inadequate. Therefore eFlex installed a system which linked the existing cameras to a single server running its Vision software under a single naming convention. The capabilities it now provides include: information related to server health, such as hard drive and network bandwidth usage; incoming and non-conforming file monitoring; and server status and activity monitoring. At a local level, camera connection status, image rate and time and date of last file are also provided and monitored.
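A few of the server-health and camera-status checks listed above can be sketched with the standard library alone; the thresholds are illustrative assumptions, not values from the eFlex Vision software.

```python
import shutil
from datetime import datetime

DISK_WARN_FRACTION = 0.90   # warn when the volume is 90% full (assumed)

def disk_health(path: str = "/") -> dict:
    """Report hard-drive usage for the image store volume."""
    total, used, free = shutil.disk_usage(path)
    frac = used / total
    return {"used_fraction": round(frac, 3),
            "warning": frac >= DISK_WARN_FRACTION}

def camera_status(last_file_time: datetime, now: datetime,
                  max_silence_s: float = 60.0) -> str:
    """Flag a camera whose most recent image file is too old."""
    silence = (now - last_file_time).total_seconds()
    return "stale" if silence > max_silence_s else "ok"
```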
 
Interestingly, McKiernan adds, eFlex is now experiencing a perceptible demand for the provision of a similar but “stripped back” level of capability from the SME supplier community. Even if they do not need the level of analytics that an OEM might, he observes, the fundamental requirement for “traceability” – the need to check back to the state of a specific product item when it left the factory – remains.