In the demo, I use Process Simulate to simulate an environment with both a UR robot, operated directly by ROS, and a KUKA robot, operated by Process Simulate. You'll also see a human and other equipment in use, with everything synchronized by Process Simulate's simulated PLCs.
As you'll see in the demo, the UR robot picks boxes from a conveyor and moves them to the correct container according to color. The color is determined by the OpenCV package, which is loaded by ROS.
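In the demo the color check runs through OpenCV inside a ROS node; the routine below is only a minimal, self-contained sketch of the same idea. All names are illustrative, and a plain average over RGB channels stands in for an OpenCV `cvtColor`/`inRange` pipeline so the snippet runs without OpenCV installed:

```python
def classify_box_color(pixels):
    """Classify a box as 'red', 'green', or 'blue' from a list of
    (r, g, b) pixel tuples by comparing average channel intensities.
    A toy stand-in for an OpenCV color-segmentation pipeline."""
    n = len(pixels)
    avg = [sum(p[i] for p in pixels) / n for i in range(3)]
    # Pick the dominant channel; ties resolve in channel order.
    return ("red", "green", "blue")[avg.index(max(avg))]

# Example: a patch of mostly-red pixels sampled from the box.
patch = [(200, 30, 20), (190, 40, 25), (210, 35, 30)]
print(classify_box_color(patch))  # → red
```

The classification result is what the ROS node would then use to choose the target container.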
Once a container is full, the KUKA robot picks it up and moves it to the removal area, where a human takes it away.
The simulation uses proximity, light, and vision sensors to provide information that is gathered in Process Simulate. Some of it is processed locally, and some is sent in real time, at each time interval, to the ROS environment.
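One way to picture this exchange is as packaging each interval's sensor readings into a message for the ROS side. The field names and JSON format below are purely hypothetical; the demo's actual transport between Process Simulate and ROS is not shown here:

```python
import json

def build_interval_message(tick, sensors):
    """Pack one simulation interval's sensor readings into a JSON
    payload for the ROS environment. All field names are illustrative,
    with defaults for sensors that reported nothing this interval."""
    return json.dumps({
        "tick": tick,                                   # interval counter
        "proximity": sensors.get("proximity", False),   # object detected?
        "light": sensors.get("light", 0.0),             # light level 0..1
        "vision_frame_id": sensors.get("vision_frame_id"),
    })

msg = build_interval_message(42, {"proximity": True, "light": 0.8,
                                  "vision_frame_id": "frame_0042"})
print(msg)
```

Per-interval messages like this keep the ROS side in lockstep with the simulation clock rather than with wall-clock time.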
In the ROS environment, the only thing modeled is the UR robot. The ROS-controlled robot uses the OpenCV and MoveIt packages to understand its surroundings and plan its path. At each time interval, ROS sends the robot's next location to Process Simulate.
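In the demo, MoveIt produces the actual trajectory. As a rough stand-in, the loop below steps a joint-space position toward a goal and emits one "next location" per interval, which is the shape of the update sent back to Process Simulate (a toy interpolation, not MoveIt's planner):

```python
def next_location(current, goal, step=0.1):
    """Move each joint at most `step` radians toward the goal,
    mimicking the per-interval position updates sent from ROS
    to Process Simulate. Illustrative only, not a real planner."""
    return [c + max(-step, min(step, g - c)) for c, g in zip(current, goal)]

pose = [0.0, 0.0, 0.0]          # current joint values (radians)
goal = [0.3, -0.2, 0.05]        # target joint values
for tick in range(4):
    pose = next_location(pose, goal)
    print(tick, [round(j, 2) for j in pose])  # one update per interval
```

Each iteration corresponds to one time interval: the simulation applies the new pose, and the next interval's sensor data flows back to ROS.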
To summarize, the step forward this demo represents is the collaboration between ROS packages and the Process Simulate environment, including the use of simulated industrial robots and equipment.
If you would like to share interesting viewpoints, use cases, or environment challenges related to ROS, you are more than welcome to reach out at: