IOCAS-IR > Key Laboratory of Marine Geology and Environment
Wound intensity correction and segmentation with convolutional neural networks
Lu, Huimin1,2,3; Li, Bin4; Zhu, Junwu4; Li, Yujie1,4; Li, Yun4; Xu, Xing5,8; He, Li6; Li, Xin3; Li, Jianru7; Serikawa, Seiichi1
Abstract
Wound area changes over multiple weeks are highly predictive of the wound healing process, and a big data eHealth system would be very helpful in evaluating these changes. Images of the wound bed are usually analyzed to diagnose injury. Unfortunately, accurately measuring wound region changes from images is difficult, because many factors degrade image quality, such as intensity inhomogeneity and color distortion. To this end, we propose a fast level set model-based method for intensity inhomogeneity correction and a spectral properties-based color correction method to overcome these obstacles. State-of-the-art level set methods segment objects well but are time-consuming and inefficient. In contrast to conventional approaches, the proposed model integrates a new signed energy force function that efficiently detects contours at weak or blurred edges. It ensures the smoothness of the level set function and reduces the computational cost of re-initialization. To further speed up the algorithm, we also incorporate an additive operator-splitting scheme into our fast level set model. In addition, we use camera, lighting, and spectral properties to recover the actual color. Experiments on synthetic and real-world images demonstrate the advantages of the proposed method over state-of-the-art methods, and show that the proposed model is at least twice as fast as widely used methods. Copyright (C) 2016 John Wiley & Sons, Ltd.
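The paper's exact energy functional is not reproduced in this record, but the general idea it builds on, evolving a level set function under a signed force so the zero level set settles on object boundaries, can be sketched minimally. Everything below (the threshold-based force `c - image`, the circular initialization, the clipping for numerical stability, and the synthetic disk image) is an illustrative assumption, not the authors' model, which additionally handles intensity inhomogeneity and uses an additive operator-splitting solver.

```python
import numpy as np

def segment_level_set(image, iters=200, dt=0.9, c=0.5, clip=50.0):
    """Minimal level-set segmentation sketch (illustrative, not the paper's model).

    Evolves phi_t = F * |grad phi|, where the signed force F = c - I is
    negative inside regions brighter than threshold c and positive outside,
    so phi is driven negative (inside) over the bright object.
    """
    h, w = image.shape
    # Initialize phi as the signed distance to a centered circle.
    y, x = np.mgrid[:h, :w]
    phi = np.sqrt((x - w / 2.0) ** 2 + (y - h / 2.0) ** 2) - min(h, w) / 4.0
    force = c - image  # signed force: negative inside the bright object
    for _ in range(iters):
        gy, gx = np.gradient(phi)
        grad = np.sqrt(gx ** 2 + gy ** 2) + 1e-8
        # Explicit update; clipping keeps phi (and its gradient) bounded.
        phi = np.clip(phi + dt * force * grad, -clip, clip)
    return phi < 0  # boolean mask of the segmented interior

# Synthetic test image: bright disk on a dark background.
h = w = 64
y, x = np.mgrid[:h, :w]
disk = ((x - 32) ** 2 + (y - 32) ** 2 < 15 ** 2).astype(float)
mask = segment_level_set(disk)
```

A simple explicit scheme like this needs a small time step to stay stable; the additive operator-splitting scheme mentioned in the abstract exists precisely to allow much larger steps, which is where the reported speedup comes from.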
Keywords: Illumination Correction; Big Data; Level Set Model; eHealth Analysis System
Indexed By: SCI; ISTP
WOS ID: WOS:000398035700013
Citation statistics
Cited Times (WOS): 48
Document Type: Journal Article
Affiliation:
1.Kyushu Inst Technol, Dept Elect & Elect Engn, Kitakyushu, Fukuoka 8048550, Japan
2.Chinese Acad Sci, Inst Oceanol, Qingdao 266071, Peoples R China
3.Shanghai Jiao Tong Univ, State Key Lab Ocean Engn, Shanghai 200240, Peoples R China
4.Yangzhou Univ, Sch Informat Engn, Yangzhou 225009, Jiangsu, Peoples R China
5.Kyushu Univ, Dept Informat Sci & Elect Engn, Fukuoka 8190395, Japan
6.Qualcomm R&D Ctr, San Diego, CA 92121 USA
7.Tongji Univ, State Key Lab Marine Geol, Shanghai 200092, Peoples R China
8.Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
First Author Affiliation: Institute of Oceanology, Chinese Academy of Sciences
Recommended Citation
GB/T 7714
Lu, Huimin, Li, Bin, Zhu, Junwu, et al. Wound intensity correction and segmentation with convolutional neural networks[J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2017, 29(6).
APA Lu, Huimin, Li, Bin, Zhu, Junwu, Li, Yujie, Li, Yun, ... & Serikawa, Seiichi. (2017). Wound intensity correction and segmentation with convolutional neural networks. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 29(6).
MLA Lu, Huimin, et al. "Wound intensity correction and segmentation with convolutional neural networks". CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE 29.6 (2017).
Files in This Item:
File Name/Size: Wound intensity corr (364KB) | DocType: Journal Article | Version: Published version | Access: Open Access | License: CC BY-NC-SA
Google Scholar
Similar articles in Google Scholar
[Lu, Huimin]'s Articles
[Li, Bin]'s Articles
[Zhu, Junwu]'s Articles
File name: Wound intensity correction and segmentation with convolutional neural networks.pdf
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.