Human Generated Data

Title

Untitled (five cheerleaders with hands on hips)

Date

1953-1954

People

Artist: Harry Annas, American, 1897-1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3175

Machine Generated Data

Tags (label, confidence score 0-100)

Amazon
created on 2022-01-22

Person 97.2
Human 97.2
Person 96.7
Person 96
Person 95
Clothing 92.9
Apparel 92.9
Person 89.2
Toy 79.5
Doll 79
Figurine 73.1
Girl 58.2
Female 58.2
Room 56.9
Indoors 56.9
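
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. Below is a minimal sketch of how such tags could be reproduced with boto3; the image filename and the confidence cutoff are placeholders, not values taken from this record.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Hypothetical local copy of the photograph; not a path from this record.
with open("annas_cheerleaders.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=50,  # assumed cutoff; the tags above bottom out in the mid 50s
)

for label in response["Labels"]:
    # Prints lines such as "Person 97.2", matching the tag format above.
    print(f"{label['Name']} {label['Confidence']:.1f}")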

Clarifai
created on 2023-10-26

people 99.8
group 99
wear 98.9
dancer 97.4
dancing 96.6
man 96.6
woman 95.6
adult 95.1
actor 93.7
group together 93.6
monochrome 91.7
veil 91.7
child 90.9
music 90.7
art 90.7
costume 90.2
outfit 88.6
dress 88
actress 86.7
girl 85.4

Imagga
created on 2022-01-22

people 26.2
man 22.5
adult 21.5
male 19.2
person 19
men 16.3
brass 15.7
women 15
black 13.8
case 13.3
wind instrument 12.9
business 12.7
love 12.6
silhouette 12.4
couple 12.2
party 12
youth 11.9
happiness 11.7
fashion 11.3
happy 11.3
celebration 11.2
dress 10.8
holding 10.7
pretty 10.5
group 10.5
professional 10.4
clothing 10.3
bride 9.9
attractive 9.8
portrait 9.7
art 9.5
musical instrument 9.3
bouquet 9.1
indoor 9.1
human 9
businessman 8.8
together 8.8
model 8.5
elegance 8.4
sport 8.4
hand 8.3
dance 8.3
teenager 8.2
girls 8.2
family 8
lifestyle 7.9
indoors 7.9
design 7.9
dancer 7.8
modern 7.7
husband 7.6
two 7.6
wife 7.6
style 7.4
symbol 7.4
wedding 7.3
color 7.2
suit 7.2
looking 7.2
active 7.2
home 7.2
work 7.1
smile 7.1
job 7.1
interior 7.1
day 7.1
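
The Imagga tags follow the same label/confidence pattern. A minimal sketch, assuming Imagga's v2 tagging endpoint and a publicly hosted copy of the image; the credentials and URL below are placeholders.

import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/annas_cheerleaders.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Prints lines such as "people 26.2", matching the Imagga tag format above.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")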

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

wall 96.7
clothing 86.8
window 85.6
text 84.5
posing 78.7
white 77.9
person 76.5
dance 70.3
footwear 59.4
old 40.5

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 92.1%
Happy 94.9%
Sad 2.4%
Fear 1%
Surprised 0.9%
Confused 0.3%
Disgusted 0.2%
Calm 0.2%
Angry 0.2%

AWS Rekognition

Age 26-36
Gender Female, 58.5%
Happy 52.9%
Calm 21.4%
Sad 19.7%
Confused 2.3%
Surprised 1.5%
Disgusted 1.1%
Angry 0.6%
Fear 0.5%

AWS Rekognition

Age 31-41
Gender Male, 99.1%
Happy 65.2%
Calm 30.7%
Surprised 2.3%
Confused 0.5%
Disgusted 0.3%
Fear 0.3%
Sad 0.3%
Angry 0.2%

AWS Rekognition

Age 34-42
Gender Female, 54.4%
Calm 64.2%
Happy 27.6%
Surprised 3.1%
Confused 1.7%
Sad 1.4%
Disgusted 1.3%
Angry 0.5%
Fear 0.2%

AWS Rekognition

Age 30-40
Gender Female, 80.6%
Happy 77.2%
Calm 19%
Surprised 2.4%
Sad 0.4%
Disgusted 0.3%
Fear 0.2%
Angry 0.2%
Confused 0.2%
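
Each AWS Rekognition block above (age range, gender, and emotion scores for one detected face) corresponds to one entry in the FaceDetails list returned by the DetectFaces API when all attributes are requested. A minimal boto3 sketch follows; the filename is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_cheerleaders.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Prints lines such as "Happy 94.9%", matching the blocks above.
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")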

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
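
The Google Vision blocks report likelihood ratings rather than percentages. A minimal sketch, assuming the google-cloud-vision client library; the file path is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("annas_cheerleaders.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums (VERY_UNLIKELY, UNLIKELY, ...) map to the ratings above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)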

Feature analysis

Amazon

Person 97.2%

Text analysis

Amazon

RODVR
RODVR -COVEETA
of
-COVEETA
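
The text analysis entries are raw OCR detections from the image (the lettering is only partially legible, hence strings like "RODVR -COVEETA"). A minimal sketch of producing such detections with AWS Rekognition's DetectText API; the filename is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("annas_cheerleaders.jpg", "rb") as f:  # hypothetical local file
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    # LINE detections cover whole lines and WORD detections individual tokens,
    # which is why both "RODVR -COVEETA" and "RODVR" appear above.
    print(detection["Type"], detection["DetectedText"])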