Comparison of EfficientNetB3 and MobileNetV2 Methods for Identification of Fruit Types Using Leaf Features

EfficientNetB3 and MobileNetV2 methods


  • Fiqry Zaelani
  • Yusup Miftahuddin, Institut Teknologi Nasional



Technology today is developing rapidly in many fields. One benefit of these developments is that they help people work in various areas, for instance in plantations, in improving fruit quality, and even in education. One such application is identifying fruit types, which is useful for the general public and even for children. Being able to distinguish fruit types by the shape of their leaves can help increase knowledge about fruits; for ordinary people, it is quite difficult to tell what kind of fruit a leaf belongs to. Therefore, this study proposes a Convolutional Neural Network approach, comparing the EfficientNet-B3 and MobileNet-V2 architectures under several parameter settings to obtain the best accuracy in detecting fruit types from leaf features. EfficientNet-B3 and MobileNet-V2 are CNN models pre-trained on a fairly large dataset, namely ImageNet. In this study, several parameters were varied, including the number of epochs, the optimizer (Adam, Adamax, or SGD), and the batch size. EfficientNet-B3 with epoch 20 and the SGD optimizer produced an accuracy of 0.2370 (23%), while EfficientNet-B3 with epoch 50 and the Adamax optimizer produced an accuracy of 0.3051 (30%). In addition, MobileNet-V2 with epoch 20 and the Adam optimizer achieved an accuracy of 0.9914 (99%), while MobileNet-V2 with epoch 50 and the Adamax optimizer achieved an accuracy of 0.9860 (98%).

Keywords: Leaf, Convolutional Neural Network, EfficientNet-B3, MobileNet-V2
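
The setup described in the abstract can be sketched as a standard Keras transfer-learning pipeline: a MobileNet-V2 backbone with a small classification head, compiled with one of the optimizers the study compares (Adam, Adamax, or SGD). This is a minimal illustrative sketch, not the authors' code; the number of classes and input size here are assumptions, and `weights=None` is used only to keep the example offline (the study uses ImageNet pre-trained weights, i.e. `weights="imagenet"`).

```python
import tensorflow as tf

NUM_CLASSES = 30  # hypothetical number of leaf/fruit classes, not from the paper


def build_model(optimizer_name: str = "adam") -> tf.keras.Model:
    """Build a MobileNetV2-based classifier; optimizer_name is 'adam', 'adamax', or 'sgd'."""
    # weights=None keeps this sketch self-contained; the study loads
    # ImageNet pre-trained weights instead (weights="imagenet").
    backbone = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights=None
    )
    model = tf.keras.Sequential([
        backbone,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(
        optimizer=optimizer_name,
        loss="categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


model = build_model("adamax")
print(model.output_shape)  # (None, 30)
```

Swapping `MobileNetV2` for `EfficientNetB3` (also available under `tf.keras.applications`) and varying the epoch count and batch size passed to `model.fit` would reproduce the kind of parameter comparison the abstract describes.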





How to Cite

Fiqry Zaelani and Y. Miftahuddin, “Comparison of EfficientNetB3 and MobileNetV2 Methods for Identification of Fruit Types Using Leaf Features: EfficientNetB3 and MobileNetV2 methods”, jitter, vol. 9, no. 1, Dec. 2022.