The goal of the project is to research and develop a fully automatic solution that uses a robotic manipulator and modern 3D cameras to obtain a precise 3D reconstruction of the part of the human body where the wound is located. The system will then automatically segment the wound area on the recorded volume, determine its physical parameters (circumference, area and volume), classify the surface tissue of the wound, and determine the percentage of each wound tissue type (necrosis, fibrin and granulation).
Chronic wound healing is a lengthy process that can be further prolonged if ineffective treatment is used. Many countries around the world spend significant financial resources annually on the treatment of chronic wounds. Part of the resources spent on ineffective therapies could be saved if medical staff had reliable tools that could accurately determine the current condition of the wound and thus support the choice of correct therapy. If the same reliable and accurate tools were used at each check-up, the effect of individual therapies could be tracked and the therapy adjusted in a timely manner. Furthermore, such tools would allow monitoring of the complete wound healing process over its entire duration. Current commercial wound analysis systems based on one or a series of photographs are not accurate and reliable enough to analyze all types of wounds. Moreover, since a person is involved in taking the photographs and marking the wound surface, the analysis may be influenced by the subjective knowledge, experience and skill of that person. The introduction of a robotic manipulator and a fully automated system for recording and analyzing wounds would eliminate these subjective influences, resulting in a faster, more accurate and more reliable wound analysis process. Ultimately, treatment costs would be reduced due to fewer mistakes, patients would be more satisfied with their treatment, and medical staff would spend less time on analysis per patient, allowing more patients to be treated in the same time.
There is extensive research in the medical and technical sciences on measuring the physical characteristics of wounds, such as depth, circumference, area, and volume. Wound measurement approaches are typically divided into two main groups: contact and non-contact. Contact methods require that the measurement be performed in direct contact with the wound. Rulers and transparent foils are most often used to measure depth, circumference and area, while liquids or similar materials are used to measure volume. The main disadvantage of contact methods is the inherent inaccuracy of the measurements and their subjective interpretation. Contact measurement methods also increase the risk of additional wound infections and are generally uncomfortable for the patient. Advances in technology have led to the development of various cameras and sensors that can be used for measurement and analysis, which in turn has motivated research into non-contact methods that seek to analyze wounds as accurately as possible using their physical and other parameters. The project described here is likewise based on non-contact measurement of the wound's physical characteristics and analysis of its tissue.
This project is interdisciplinary research in which robotics and computer vision are applied to medicine, together forming a complex ensemble of ideas and solutions. Accordingly, the project addresses the areas of wound detection, 3D reconstruction, wound segmentation and wound tissue classification.
Within the five-year project plan, the development of the automatic wound analysis system is divided into the development of four subsystems. The wound detection and imaging subsystem has the task of automatically locating the wound to be analyzed on the patient and determining the optimal positions for imaging the wound with a 3D camera in order to create a complete and accurate 3D model of the wound. The subsystem for 3D reconstruction of the wound surface will fuse the images obtained by the previous subsystem into a precise 3D model of the wound. This subsystem will then identify potential defects in the reconstructed model, such as holes in the reconstructed surface (due to a poorly placed imaging position), and re-record the views needed to ensure uniform quality of the reconstructed wound surface. The wound segmentation subsystem will have the task of accurately separating the wound area from the surrounding healthy tissue and then determining the circumference, area and volume of the wound. Finally, the wound tissue classification subsystem will determine the tissue type over the entire isolated surface of the 3D wound model, yielding the percentages of granulation, fibrin and necrosis. The development of the individual subsystems will require the construction of a database of annotated wound images, which will be made public at the end of the project together with instructions for use, making it easier for other scientists in this field to implement and evaluate their own algorithms and systems for wound analysis. During the project, wound imaging and system assessment in real conditions will be carried out at the project's partner institution, the Clinic for Surgery of the Clinical Hospital Center Osijek.
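To illustrate the kind of computation the segmentation and classification subsystems perform, the following is a minimal sketch in Python. It assumes, purely for illustration, that the segmented wound region is available as a triangle mesh (vertex array plus face indices) with consistently outward-oriented faces, and that the classifier has assigned one tissue label per triangle; the function names and the toy tetrahedron are hypothetical and are not the project's actual implementation. Area follows from summing triangle areas, volume from the divergence theorem over a closed mesh, and tissue percentages from area-weighting the per-triangle labels (circumference would analogously be the summed length of the boundary edges of an open wound patch).

```python
import numpy as np

def surface_area(verts, faces):
    """Total mesh area: half the norm of each triangle's edge cross product."""
    v0, v1, v2 = (verts[faces[:, i]] for i in range(3))
    return 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1).sum()

def enclosed_volume(verts, faces):
    """Enclosed volume via the divergence theorem.

    Valid for a closed mesh with consistently outward-oriented faces.
    """
    v0, v1, v2 = (verts[faces[:, i]] for i in range(3))
    return abs(np.einsum('ij,ij->i', v0, np.cross(v1, v2)).sum()) / 6.0

def tissue_percentages(verts, faces, labels):
    """Area-weighted share (in %) of each tissue class per triangle label."""
    v0, v1, v2 = (verts[faces[:, i]] for i in range(3))
    areas = 0.5 * np.linalg.norm(np.cross(v1 - v0, v2 - v0), axis=1)
    total = areas.sum()
    return {lab: 100.0 * areas[labels == lab].sum() / total
            for lab in np.unique(labels)}

# Toy example: a unit tetrahedron standing in for a closed wound-region mesh.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
faces = np.array([[0, 2, 1], [0, 3, 2], [0, 1, 3], [1, 2, 3]])  # outward-oriented
labels = np.array(['granulation', 'granulation', 'fibrin', 'necrosis'])

area = surface_area(verts, faces)     # 1.5 + sqrt(3)/2 ~ 2.366
vol = enclosed_volume(verts, faces)   # 1/6 ~ 0.1667
pct = tissue_percentages(verts, faces, labels)
```

In practice the reconstructed wound surface is an open patch rather than a closed solid, so the volume of missing tissue is usually obtained by first closing the wound opening with an interpolated "healthy skin" surface; the sketch above only shows the underlying mesh arithmetic.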
Expected project results
The expected project results include a fully developed automatic system for wound analysis, realized through intermediate results: the development of the subsystems for wound detection and imaging, 3D reconstruction of wound surfaces, wound segmentation, and wound tissue classification. A further expected result is the public release of a database of annotated wound images with instructions for use, which will make it easier for other scientists in this field to implement and evaluate their own algorithms and systems.
The project “Methods for 3D reconstruction and analysis of chronic wounds” is funded by the Croatian Science Foundation under grant number UIP-2019-04-4889.