PROGNOSTICATION OF UNSEEN OBJECTS
USING ZERO-SHOT LEARNING WITH
A COMPLETE CASE ANALYSIS

Srinivasa L. Chakravarthy and Jatin V.R. Arutla

Gandhi Institute of Technology and Management
Visakhapatnam, India


INDECS 20(4), 454-468, 2022
DOI 10.7906/indecs.20.4.10

Received: 6th July 2021.
Accepted: 2nd May 2022.
Regular article

ABSTRACT

Generally, for a machine learning model to perform well, the data instances on which it is trained have to be relevant to the use case. When relevant samples are not available, zero-shot learning can be used to perform classification tasks. Zero-shot learning is the process of solving a problem for which no examples were available during the training phase: it lets a deep learning model classify target classes on which it has never been trained.
In this article, zero-shot learning is used to classify food dish classes through an object recognition model. First, the data are collected from Google Images and Kaggle. Image attributes are then extracted using a VGG16 model. The attributes belonging to the training categories are used to train a custom-built deep learning model. Various hyperparameters of the model are tuned and the results are analyzed in order to obtain the best possible performance. After training is completed, the model is tested on the image attributes extracted from the zero-shot learning categories. Predictions are made by comparing the vector of the target class with the vectors of the training classes in the Word2Vec space. The model is evaluated with Top-5 accuracy, which indicates whether the expected class appears among the top five predictions. A Top-5 accuracy of 92% is achieved by implementing zero-shot learning for the classification of unseen food dish images.
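
The prediction step described above (comparing class vectors in the Word2Vec space and scoring with Top-5 accuracy) can be illustrated with a short sketch. The code below is not taken from the article; the function names, the 300-dimensional embedding size and the food-dish labels are illustrative assumptions. It only shows how nearest-neighbour prediction and Top-5 accuracy could be computed once a mapping model has turned VGG16 image features into semantic vectors.

    # Minimal sketch (not the authors' code): zero-shot prediction by nearest
    # neighbours in a Word2Vec-style embedding space, assuming an upstream model
    # has already mapped VGG16 image features to a 300-d semantic vector.
    import numpy as np

    def top_k_predictions(predicted_vec, class_embeddings, k=5):
        """Return the k class names whose embeddings are closest (by cosine
        similarity) to the vector predicted for an image."""
        names = list(class_embeddings.keys())
        mat = np.stack([class_embeddings[n] for n in names])   # shape (C, d)
        sims = mat @ predicted_vec / (
            np.linalg.norm(mat, axis=1) * np.linalg.norm(predicted_vec) + 1e-12)
        order = np.argsort(-sims)[:k]
        return [names[i] for i in order]

    def top5_accuracy(predicted_vecs, true_labels, class_embeddings):
        """Fraction of images whose true class appears in the top-5 predictions."""
        hits = sum(
            label in top_k_predictions(vec, class_embeddings, k=5)
            for vec, label in zip(predicted_vecs, true_labels))
        return hits / len(true_labels)

    # Toy usage: random vectors stand in for the Word2Vec class embeddings and
    # for the embedding predicted from a test image's VGG16 features.
    rng = np.random.default_rng(0)
    classes = {name: rng.normal(size=300) for name in
               ["pizza", "samosa", "sushi", "ramen", "waffles", "tacos"]}
    pred = classes["sushi"] + 0.1 * rng.normal(size=300)   # a near-miss prediction
    print(top_k_predictions(pred, classes, k=5))
    print(top5_accuracy([pred], ["sushi"], classes))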

KEY WORDS
zero-shot learning, machine translation, unseen image classification

CLASSIFICATION
JEL:Z19
PACS:07.05.Pj

