The Ford Transmission Plant in Livonia, Michigan, is the site of a major innovation in automotive assembly operations: the use of AI to guide a robot in a demanding automated assembly role. Mike Farish reports
The application of AI in this case supports the installation of torque converters into the transmission cases used in a range of vehicles including the Ford Bronco Sport, Edge and Escape and the Lincoln Nautilus and Corsair. Confirmation is provided by Harry Kekedjian, industrial control systems manager based at the company’s Advanced Manufacturing Centre in nearby Redford. “AI was already used elsewhere in Ford but this is its first application in robotics for high-volume assembly,” he states. “In fact, we think it might be an industry first.”
Kekedjian explains that torque converter assembly is an intrinsically difficult task requiring the precise handling of heavy, awkward, sharp-edged components. Ford first automated the procedure a decade ago, not least to mitigate safety concerns for human operators. But, he says, the complexity of the task, particularly its very tight clearances, requires a degree of dexterity that is nevertheless more suited to people than to the “pre-engineered paths” associated with robot operations.
Opportunities and challenges in automation
In consequence, even the automated process could still be problematic, both in its programming and in its implementation. “We had to go through a whole set of parameters to optimise the process,” Kekedjian explains. “Even then it was a static solution that didn’t lend itself well to minor process changes or drift.” Moreover, the whole set of procedures had to be repeated whenever a new vehicle model was introduced.
Even the commissioning in February 2018 of a new assembly cell using two six-axis ABB robots, one for material handling and one for the crucial insertion operations, still did not satisfy all requirements, though Kekedjian prefers to say that “areas of opportunity” for further improvement were identified. “Cycle times varied quite a bit and introducing new models was still pretty painful,” he notes.
Hence the need was for something that could combine the flexibility and adaptability associated with people with the uniformity and repeatability associated with robots. It was found through the addition to the cell of the AI software Symbio DCS from Symbio Robotics of Emeryville, California, on the US West Coast.
“The AI converged on solutions right from the start with a 50% reduction in the time and number of parts required to get to an optimal set of parameters” – Harry Kekedjian, Ford
Kekedjian explains that the attraction of AI was that it could facilitate not just faster initial programming but also continuing process enhancement once production was underway. “We thought AI would be useful because we could implement the algorithm so that it can continuously adapt from one cycle to the next and would not just be limited to the training period,” he states.
“There are two levels to the algorithm,” says Kekedjian. “One is learning in-cycle, taking sensory feedback characterising how much friction it is feeling and so very much emulating the way people learn. The other is inter-cycle learning in which it catalogues whatever has been successful as a set of parameters and ranks them over time.” In consequence, he continues, “it builds up a database of what has been successful.”
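The two-level scheme Kekedjian describes can be illustrated with a minimal sketch. Everything below is a hypothetical illustration of the idea, not Symbio's actual software: in-cycle, the controller backs off as sensed friction rises; inter-cycle, parameter sets that worked are catalogued and ranked by how well they performed.

```python
import random

class TwoLevelLearner:
    """Illustrative sketch of the two-level learning described above:
    in-cycle force-based adaptation plus inter-cycle parameter ranking.
    All names and values here are assumptions, not Symbio DCS internals."""

    def __init__(self):
        # Catalogue of parameter sets that produced successful insertions,
        # mapped to the cycle times they achieved (lower is better).
        self.catalogue = {}  # params (tuple) -> list of cycle times

    def pick_parameters(self):
        """Inter-cycle: usually reuse the best-ranked catalogued parameters,
        otherwise explore a fresh random candidate."""
        if self.catalogue and random.random() > 0.2:
            return min(self.catalogue,
                       key=lambda p: sum(self.catalogue[p]) / len(self.catalogue[p]))
        return (round(random.uniform(1.0, 5.0), 2),    # e.g. insertion speed
                round(random.uniform(5.0, 20.0), 2))   # e.g. force threshold

    def run_cycle(self, params, read_friction):
        """In-cycle: reduce the commanded speed whenever sensed friction
        exceeds the threshold, emulating a person easing off by feel."""
        speed, force_limit = params
        elapsed = 0.0
        for _ in range(10):  # ten control steps per insertion
            if read_friction() > force_limit:
                speed *= 0.8  # back off when resistance is felt
            elapsed += 1.0 / speed
        return elapsed

    def record(self, params, cycle_time):
        """Inter-cycle: catalogue the outcome so future cycles can rank it."""
        self.catalogue.setdefault(params, []).append(cycle_time)
```

Run cycle after cycle, the catalogue accumulates a database of what has been successful, which is the behaviour Kekedjian describes.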
Upgrading the process with AI
Kekedjian’s mention of friction highlights another key distinction between the Livonia assembly cell and previous AI-enhanced robotic operations within Ford. Existing AI-enhanced robotic operations, he explains, are used for defect detection and utilise vision systems to gather the data that drives them. But the sensor technology guiding the Livonia assembly cell is based on force feedback. “There is a force sensor on the end-effector of the robot and this is the first force-based AI robotic solution that we have deployed,” he confirms. Hence, as an AI-enhanced robotic operation, the cell is currently unique both in the way it gathers data and in the type of operation it carries out.
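A force-guided loop of the general kind described can be sketched as follows. The sensor interface, thresholds and step sizes are illustrative assumptions, not Ford's or Symbio's implementation:

```python
def guide_insertion(read_force, move, max_force=15.0, step=0.5, depth=30.0):
    """Illustrative force-guided insertion: advance the end-effector in small
    steps, and nudge laterally whenever the force sensor reports axial
    resistance above a threshold (the robot's 'sense of touch')."""
    inserted = 0.0
    while inserted < depth:
        fx, fy, fz = read_force()          # force on the end-effector (N)
        if abs(fz) > max_force:
            # Too much axial resistance: shift against the lateral force
            # components before trying to advance again.
            move(dx=-0.01 * fx, dy=-0.01 * fy, dz=0.0)
        else:
            move(dx=0.0, dy=0.0, dz=step)  # clear to advance
            inserted += step
    return inserted
```

The point of such a loop is that the path is not pre-engineered: the trajectory emerges cycle by cycle from what the sensor reports.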
Kekedjian says that the upgrading of the cell took place “over a weekend”. But he adds that intensive preparatory work was carried out earlier in the Advanced Manufacturing Centre. “We proved out the AI beforehand and ran many parts to be confident it would work,” he states.
Indeed, as Kekedjian confirms, process improvements followed very quickly. “The AI converged on solutions right from the start with a 50% reduction in the time and number of parts required to get to an optimal set of parameters,” he says. “Pretty well straight away there was a 15% improvement in throughput, largely driven by cycle-time improvement. Cycle-time variation was also significantly reduced.” He adds that while overall cycle time in the cell is about 20 seconds, the assembly task carried out by the AI-enhanced robot takes about five seconds, and that is where the improvements occurred.
Unsurprisingly Kekedjian confirms that the company is “currently evaluating where we could replicate this particular application.” But he is also pragmatic. “AI doesn’t apply to everything,” he says. “It is another tool in the toolbox. But it is obviously good for adaptive solutions and we can see AI playing an increasing role in complicated assembly tasks.”
“At Symbio we focus on three capabilities – force feedback or the equivalent of a sense of touch for a robot, computer vision and machine learning, in other words using data generated by a process to optimise that process” – Max Reynolds, Symbio
Improving performance in existing systems
The potential for AI capabilities to facilitate the enhanced automation of assembly operations in automotive manufacturing is confirmed by Max Reynolds, CEO and co-founder of Symbio. Walk around the shopfloor in almost any automotive plant, he observes, and “you will find welding use cases already nearly 100% automated, but for assembly operations it is probably less than five percent.” Symbio has therefore “primarily been focussed on developing software that allows customers both to automate use cases in assembly that were not possible before and also to improve the performance of existing automation.”
Reynolds is quite matter-of-fact in his definition of AI. It is, he says, simply an “umbrella term that refers to a broad range of algorithms and technologies.” But he is more specific in identifying how Symbio aims to exploit it. “At Symbio we focus on three capabilities – force feedback or the equivalent of a sense of touch for a robot, computer vision and machine learning, in other words using data generated by a process to optimise that process.” Of those, he says, vision technologies have been around for a couple of decades and force feedback for perhaps half that span. But the new capability now being brought into industry under the AI banner is machine learning driven by data from those existing sensor technologies.
This is, Reynolds says, particularly apposite for “moving line applications”, which makes it especially relevant to many automotive industry requirements. “It facilitates insertion, fastening and dispense tasks on a moving line that would otherwise be unfeasible,” he explains. In fact, he continues, Symbio “has supported moving line applications for the last 18 months and we have had practical working solutions in automotive plants for that time.” Though he cannot provide further details at the moment he confirms that apart from the Ford Livonia application Symbio is also working with Nissan and Toyota on a number of solutions in both powertrain assembly and final assembly in the US.
A point that Reynolds is keen to stress, though, is that the AI software can work with existing off-the-shelf hardware. “We work primarily with existing robot and sensor hardware,” he says. “The essential enabling technology for Symbio DCS is industrial IoT and our software platform is compatible with most of the networking protocols that off-the-shelf robotics companies provide.”
At Ford Livonia, for instance, “the only modification we made was enabling a networking protocol on the motion controller and swapping analogue force-torque sensors for digital counterparts.” The latter move, he adds, is very much in line with “the shift from analogue sensing and procedural programming to net-enabled digital interfaces” that he says is now taking place more widely.