Human-Robot Collaboration (HRC) is increasingly integrated into industrial settings, combining the efficiency of automation with the flexibility of human workers. To ensure safety, the ISO/TS 15066:2016 standard outlines four types of collaborative operation. Among these, Speed and Separation Monitoring (SSM) emerges as the most promising for enhancing accessibility of shared workspaces while maintaining high throughput. However, current implementations of SSM face significant challenges arising from hardware, software, and regulatory limitations. Realizing the full potential of dynamically changing safety zones requires precise, real-time data on the speed, trajectory, and intent of both human and robot. Existing monitoring sensors and algorithms cannot yet reliably acquire these measurements, and even where such data are obtainable, the acquiring systems are not safety-rated for industrial applications. Ambiguities within ISO/TS 15066 and the lack of standardized terminology for different SSM methods further complicate integration. This paper introduces a refined classification of SSM based on separation distance calculation (Fixed Sized, Variable Sized, Variable Shaped) and monitoring approach (Static, Dynamic), providing a structured framework for evaluating SSM implementations. While Fixed Sized SSM is widely used because of its simplicity, it lacks the real-time adaptability required for optimal collaboration. In contrast, Variable Sized and Variable Shaped SSM dynamically optimize safety zones but remain underutilized owing to technological and regulatory barriers. The second categorization distinguishes between Static Monitoring, where the zones have a fixed position, and Dynamic Monitoring, where they adapt to the movement of the robotic system. By providing a structured terminology and exploring these categories with examples and research, this paper aims to advance the understanding and implementation of SSM.
Addressing current challenges and ambiguities in standards is critical for the broader adoption of SSM, paving the way for safer, more efficient, and accessible collaborative robotic systems.
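The Fixed Sized versus Variable Sized distinction above reduces to how often, and with which inputs, the protective separation distance of ISO/TS 15066 is evaluated. The sketch below illustrates the standard's simplified constant-speed form of that distance; the function name, parameter names, and example values are illustrative, not from the standard:

```python
def protective_separation_distance(
    v_h: float,          # human speed toward the robot [m/s]
    v_r: float,          # robot speed toward the human [m/s]
    t_reaction: float,   # sensing + processing reaction time [s]
    t_stop: float,       # robot stopping time [s]
    d_stop: float,       # distance the robot travels while stopping [m]
    c: float = 0.0,      # intrusion distance into the sensing field [m]
    z_d: float = 0.0,    # human position measurement uncertainty [m]
    z_r: float = 0.0,    # robot position measurement uncertainty [m]
) -> float:
    """Additive protective separation distance, assuming constant speeds
    over the reaction and stopping intervals."""
    s_h = v_h * (t_reaction + t_stop)  # human contribution while robot reacts and brakes
    s_r = v_r * t_reaction             # robot motion before braking begins
    s_s = d_stop                       # robot stopping distance
    return s_h + s_r + s_s + c + z_d + z_r
```

Fixed Sized SSM evaluates such a formula once at design time with worst-case inputs, yielding a conservative static zone; Variable Sized SSM re-evaluates it every control cycle with measured robot speed, which is what demands the safety-rated real-time data discussed above.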
References
[1] Yitmen, I. and Almusaed, A. (2024) Synopsis of Industry 5.0 Paradigm for Human-Robot Collaboration. In: Artificial Intelligence, IntechOpen, 1-21. https://doi.org/10.5772/intechopen.1005583
[2] Lakshminarayanan, S., Kana, S., De San Bernabe, A., Turlapati, S.H., Accoto, D. and Campolo, D. (2024) Robots in Manufacturing: Programming, Control, and Safety Standards. In: Patel, C.D. and Chen, C.-H., Eds., Digital Manufacturing, Elsevier, 85-131. https://doi.org/10.1016/b978-0-443-13812-6.00011-7
[3] Halim, J., Eichler, P., Krusche, S., Bdiwi, M. and Ihlenfeldt, S. (2022) No-Code Robotic Programming for Agile Production: A New Markerless-Approach for Multimodal Natural Interaction in a Human-Robot Collaboration Context. Frontiers in Robotics and AI, 9, Article 1001955. https://doi.org/10.3389/frobt.2022.1001955
[4] George, P., Cheng, C., Pang, T.Y. and Neville, K. (2023) Task Complexity and the Skills Dilemma in the Programming and Control of Collaborative Robots for Manufacturing. Applied Sciences, 13, Article 4635. https://doi.org/10.3390/app13074635
(2016) ISO/TS 15066:2016 Robots and Robotic Devices—Collaborative Robots. https://www.iso.org/standard/62996.html
[7] (2014) ISO 13482:2014 Robots and Robotic Devices—Safety Requirements for Personal Care Robots. https://www.iso.org/standard/53820.html
[8] Kim, Y.G., Xiloyannis, M., Accoto, D. and Masia, L. (2018) Development of a Soft Exosuit for Industrial Applications. 2018 7th IEEE International Conference on Biomedical Robotics and Biomechatronics (Biorob), Enschede, 26-29 August 2018, 324-329. https://doi.org/10.1109/biorob.2018.8487907
[9] Bogue, R. (2017) Robots That Interact with Humans: A Review of Safety Technologies and Standards. Industrial Robot: An International Journal, 44, 395-400. https://doi.org/10.1108/ir-04-2017-0070
[10] Kim, Y.G., Little, K., Noronha, B., Xiloyannis, M., Masia, L. and Accoto, D. (2020) A Voice Activated Bi-Articular Exosuit for Upper Limb Assistance during Lifting Tasks. Robotics and Computer-Integrated Manufacturing, 66, Article 101995. https://doi.org/10.1016/j.rcim.2020.101995
[11] Yang, S., Garg, N.P., Gao, R., Yuan, M., Noronha, B., Ang, W.T., et al. (2023) Learning-Based Motion-Intention Prediction for End-Point Control of Upper-Limb-Assistive Robots. Sensors, 23, Article 2998. https://doi.org/10.3390/s23062998
[12] Yen, S., Tang, P., Lin, Y. and Lin, C. (2019) Development of a Virtual Force Sensor for a Low-Cost Collaborative Robot and Applications to Safety Control. Sensors, 19, Article 2603. https://doi.org/10.3390/s19112603
[13] Klimaszewski, J., Janczak, D. and Piorun, P. (2019) Tactile Robotic Skin with Pressure Direction Detection. Sensors, 19, Article 4697. https://doi.org/10.3390/s19214697
[14] Tsuji, S. and Kohama, T. (2019) Proximity Skin Sensor Using Time-of-Flight Sensor for Human Collaborative Robot. IEEE Sensors Journal, 19, 5859-5864. https://doi.org/10.1109/jsen.2019.2905848
[15] Cheng, G., Dean-Leon, E., Bergner, F., Rogelio Guadarrama Olvera, J., Leboutet, Q. and Mittendorfer, P. (2019) A Comprehensive Realization of Robot Skin: Sensors, Sensing, Control, and Applications. Proceedings of the IEEE, 107, 2034-2051. https://doi.org/10.1109/jproc.2019.2933348
[16] Ankit, Ho, T.Y.K., Nirmal, A., Kulkarni, M.R., Accoto, D. and Mathews, N. (2021) Soft Actuator Materials for Electrically Driven Haptic Interfaces. Advanced Intelligent Systems, 4, Article 2100061. https://doi.org/10.1002/aisy.202100061
[17] (2006) Directive 2006/42/EC of the European Parliament and of the Council of 17 May 2006 on Machinery, and Amending Directive 95/16/EC (Recast). Official Journal of the European Union, 157, 24-86. https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:32006L0042
[18] Kumar, S., Savur, C. and Sahin, F. (2021) Survey of Human-Robot Collaboration in Industrial Settings: Awareness, Intelligence, and Compliance. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 51, 280-297. https://doi.org/10.1109/tsmc.2020.3041231
[19] Vicentini, F. (2020) Terminology in Safety of Collaborative Robotics. Robotics and Computer-Integrated Manufacturing, 63, Article 101921. https://doi.org/10.1016/j.rcim.2019.101921
[20] Solawetz, J. (2020) What is YOLOv5? A Guide for Beginners. https://blog.roboflow.com/yolov5-improvements-and-evaluation/
[21] Moel, A., Denenberg, S. and Wartenberg, M. (2022) Implementing Effective Speed and Separation Monitoring with Legacy Industrial Robots—State of the Art, Issues, and the Way Forward. In: Aldinhas Ferreira, M.I. and Fletcher, S.R., Eds., Intelligent Systems, Control and Automation: Science and Engineering, Springer International Publishing, 235-254. https://doi.org/10.1007/978-3-030-78513-0_13
[22] Terreran, M., Lamon, E., Michieletto, S. and Pagello, E. (2020) Low-Cost Scalable People Tracking System for Human-Robot Collaboration in Industrial Environment. Procedia Manufacturing, 51, 116-124. https://doi.org/10.1016/j.promfg.2020.10.018
[23] (2024) ISO 13855:2024 Safety of Machinery—Positioning of Safeguards with Respect to the Approach Speeds of Parts of the Human Body. https://www.iso.org/standard/80590.html
[24] (2011) ISO 10218:2011 Robots and Robotic Devices—Safety Requirements for Industrial Robots. https://www.iso.org/standard/51330.html
[25] Scalera, L., Nainer, C., Giusti, A. and Gasparetto, A. (2023) Robust Safety Zones for Manipulators with Uncertain Dynamics in Collaborative Robotics. International Journal of Computer Integrated Manufacturing, 37, 887-899. https://doi.org/10.1080/0951192x.2023.2258111
[26] PILZ (2023) Sensor Technology PSEN, Control and Signal Devices PIT. https://www.pilz.com/nl-BE/products/sensor-technology/safety-laser-scanner
PILZ (2008) Safe Camera System SafetyEYE. https://www.eltron.pl/uploads/manufacturer_catalogs/16/10338/SafetyEYE_EN.pdf?srsltid=AfmBOoqiT-IRSs_Jf8KRFhE649fx928_pa6PErVoTNWTAwlEpAj2916i
Kim, E., Kirschner, R., Yamada, Y. and Okamoto, S. (2020) Estimating Probability of Human Hand Intrusion for Speed and Separation Monitoring Using Interference Theory. Robotics and Computer-Integrated Manufacturing, 61, Article 101819. https://doi.org/10.1016/j.rcim.2019.101819
[34] Halme, R., Lanz, M., Kämäräinen, J., Pieters, R., Latokartano, J. and Hietanen, A. (2018) Review of Vision-Based Safety Systems for Human-Robot Collaboration. Procedia CIRP, 72, 111-116. https://doi.org/10.1016/j.procir.2018.03.043
[35] Szabo, S., Shackleford, W., Norcross, R. and Marvel, J. (2012) A Testbed for Evaluation of Speed and Separation Monitoring in a Human Robot Collaborative Environment. https://doi.org/10.6028/NIST.IR.7851
[36] Vogel, C., Fritzsche, M. and Elkmann, N. (2016) Safe Human-Robot Cooperation with High-Payload Robots in Industrial Applications. 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, 7-10 March 2016, 529-530. https://doi.org/10.1109/hri.2016.7451840
[37] Vogel, C., Schulenburg, E. and Elkmann, N. (2020) Projective-AR Assistance System for Shared Human-Robot Workplaces in Industrial Applications. 2020 25th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Vienna, 8-11 September 2020, 1259-1262. https://doi.org/10.1109/etfa46521.2020.9211953
[38] Byner, C., Matthias, B. and Ding, H. (2019) Dynamic Speed and Separation Monitoring for Collaborative Robot Applications—Concepts and Performance. Robotics and Computer-Integrated Manufacturing, 58, 239-252. https://doi.org/10.1016/j.rcim.2018.11.002
[39] Scalera, L., Lozer, F., Giusti, A. and Gasparetto, A. (2024) An Experimental Evaluation of Robot-Stopping Approaches for Improving Fluency in Collaborative Robotics. Robotica, 42, 1386-1402. https://doi.org/10.1017/s0263574724000262
[40] Andres, C.P.C., Hernandez, J.P.L., Baldelomar, L.T., Martin, C.D.F., Cantor, J.P.S., Poblete, J.P., et al. (2018) Tri-Modal Speed and Separation Monitoring Technique Using Static-Dynamic Danger Field Implementation. 2018 IEEE 10th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), Baguio City, 29 November-2 December 2018, 1-6. https://doi.org/10.1109/hnicem.2018.8666305
[41] Bascetta, L., Ferretti, G., Rocco, P., Ardo, H., Bruyninckx, H., Demeester, E., et al. (2011) Towards Safe Human-Robot Interaction in Robotic Cells: An Approach Based on Visual Tracking and Intention Estimation. 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, 25-30 September 2011, 2971-2978. https://doi.org/10.1109/iros.2011.6094642
[42] Marvel, J.A. and Norcross, R. (2017) Implementing Speed and Separation Monitoring in Collaborative Robot Workcells. Robotics and Computer-Integrated Manufacturing, 44, 144-155. https://doi.org/10.1016/j.rcim.2016.08.001
[43] Magnanimo, V., Walther, S., Tecchia, L., Natale, C. and Guhl, T. (2016) Safeguarding a Mobile Manipulator Using Dynamic Safety Fields. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, 9-14 October 2016, 2972-2977. https://doi.org/10.1109/iros.2016.7759460
[44] Ragaglia, M., Zanchettin, A.M. and Rocco, P. (2018) Trajectory Generation Algorithm for Safe Human-Robot Collaboration Based on Multiple Depth Sensor Measurements. Mechatronics, 55, 267-281. https://doi.org/10.1016/j.mechatronics.2017.12.009
[45] (2023) ISO 13849-1:2023 Safety of Machinery—Safety-Related Parts of Control Systems. https://www.iso.org/standard/73481.html