Yang, Q.; Wang, G.; Zhang, X.; Grecos, C., and Ren, P., 2020. Coastal image captioning. In: Jung, H.-S.; Lee, S.; Ryu, J.-H., and Cui, T. (eds.), Advances in Geospatial Research of Coastal Environments. Journal of Coastal Research, Special Issue No. 102, pp. 145-150. Coconut Creek (Florida), ISSN 0749-0208.

Coastal images convey rich semantic information about the corresponding coastal areas. This article presents a preliminary approach to coastal image captioning, which describes the salient semantic content of coastal images in accurate and meaningful sentences. Specifically, it exploits a state-of-the-art method, self-critical sequence training, for coastal image captioning. First, a convolutional neural network (CNN) produces fixed-length features from the coastal images. Second, the fixed-length features are fed into a long short-term memory (LSTM) network to generate captions, i.e., accurate and meaningful sentences describing each image. Finally, the LSTM is optimized within a reinforcement learning framework. Experimental evaluation on the Moye Island remote sensing dataset validates the effectiveness of the proposed approach.
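
As a concrete illustration of the pipeline described above, the sketch below shows a CNN encoder, an LSTM decoder, and a self-critical (SCST) policy-gradient loss, assuming a PyTorch implementation with a ResNet-50 backbone. The class names, feature dimensions, and reward inputs are illustrative assumptions, not the authors' actual code.

```python
# Minimal sketch of a CNN-LSTM captioning model with a self-critical (SCST)
# policy-gradient loss. Assumes PyTorch and torchvision (>= 0.13); all names,
# sizes, and the reward computation are illustrative placeholders.
import torch
import torch.nn as nn
import torchvision.models as models


class CNNEncoder(nn.Module):
    """Produces a fixed-length feature vector from a coastal image."""
    def __init__(self, feat_dim=512):
        super().__init__()
        backbone = models.resnet50(weights=None)
        self.cnn = nn.Sequential(*list(backbone.children())[:-1])  # drop FC head
        self.fc = nn.Linear(backbone.fc.in_features, feat_dim)

    def forward(self, images):                       # images: (B, 3, H, W)
        feats = self.cnn(images).flatten(1)           # (B, 2048)
        return self.fc(feats)                         # (B, feat_dim)


class LSTMDecoder(nn.Module):
    """Generates a caption word by word, conditioned on the image feature."""
    def __init__(self, vocab_size, feat_dim=512, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, feat_dim)
        self.lstm = nn.LSTMCell(feat_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, feat, captions):                # teacher-forced log-probs
        h = c = feat.new_zeros(feat.size(0), self.lstm.hidden_size)
        # Image feature is the first input step, then the embedded words.
        inputs = torch.cat([feat.unsqueeze(1), self.embed(captions[:, :-1])], dim=1)
        logps = []
        for t in range(inputs.size(1)):
            h, c = self.lstm(inputs[:, t], (h, c))
            logps.append(torch.log_softmax(self.out(h), dim=-1))
        return torch.stack(logps, dim=1)              # (B, T, vocab)


def scst_loss(sample_logps, sample_reward, greedy_reward):
    """Self-critical policy-gradient loss: the greedy-decoded caption's reward
    (e.g. CIDEr) is the baseline, so sampled captions that beat it are
    reinforced and weaker ones are suppressed.

    sample_logps: (B, T) log-probabilities of the sampled tokens.
    sample_reward, greedy_reward: (B,) sentence-level rewards.
    """
    advantage = (sample_reward - greedy_reward).detach()          # (B,)
    return -(advantage.unsqueeze(1) * sample_logps).sum(1).mean()
```

In practice, the decoder is typically pre-trained with a cross-entropy loss on ground-truth captions and then fine-tuned with the SCST loss, sampling one caption per image while a greedily decoded caption provides the reward baseline.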
