image_scores {transforEmotion}    R Documentation
Calculate image scores based on the OpenAI CLIP model
Description
This function takes an image file and a vector of classes as input and calculates a score for each class using the OpenAI CLIP model. Its primary use is to calculate FER (Facial Expression Recognition) scores: emotion scores based on the facial expression detected in an image. If there is more than one face in the image, the function returns the scores for the face selected via the face_selection argument. If there is no face in the image, the function returns NA for all classes. The function uses reticulate to call the Python functions in the image.py file. The first time you run this package/function, it will take some time to set up a working Python virtual environment in the background, which includes installing the Python libraries used for facial recognition and emotion detection in text, images, and video. Please be patient.
Usage
image_scores(image, classes, face_selection = "largest")
Arguments
image
The path to the image file or the URL of the image.
classes
A character vector of classes to classify the image into.
face_selection
The method used to select a face when the image contains more than one. Can be "largest", "left", or "right". The default, "largest", selects the largest face in the image; "left" and "right" select the face closest to the left or right edge of the image, respectively. This argument is ignored if there is only one face in the image.
Value
A data frame containing the scores for each class.
Author(s)
Aleksandar Tomašević <atomashevic@gmail.com>
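Examples
The following is a minimal usage sketch; the image path "face.jpg" and the emotion labels are illustrative placeholders, not files or values shipped with the package.
# Score a hypothetical local image against a set of emotion labels
emotions <- c("happy", "sad", "angry", "surprised", "neutral")
scores <- image_scores("face.jpg", classes = emotions)

# If the image contains several faces, select the leftmost face
# instead of the default largest one
left_scores <- image_scores("face.jpg", classes = emotions,
                            face_selection = "left")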