Human Generated Data

Title

Untitled (large crowd of people taking dance lesson in large hall)

Date

1949

People

Artist: Jack Gould, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.15168

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Clothing 99.9
Apparel 99.9
Person 98.9
Human 98.9
Person 98.3
Robe 95.2
Fashion 95.2
Wedding 93.9
Gown 93.8
Person 92.4
Person 91.5
Person 88.1
Poster 88.1
Advertisement 88.1
Person 87.2
Bridegroom 86.4
Bride 84.1
Wedding Gown 84.1
Person 78.8
Person 73.6
Female 71.3
Text 70.8
Person 70.2
People 62
Photography 60.3
Photo 60.3
Indoors 59
Person 58.1
Woman 56
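
A minimal sketch of how a label list like the Amazon block above could be produced with AWS Rekognition's detect_labels call via boto3; the file name and confidence cutoff are illustrative assumptions, not part of this record:

```python
import boto3

# Hypothetical local copy of the photograph; not part of the catalog record.
IMAGE_PATH = "gould_dance_lesson.jpg"

client = boto3.client("rekognition")

with open(IMAGE_PATH, "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the record lists labels down to the mid-50s
    )

# Print "Label confidence" pairs, mirroring the tag list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```

Repeated labels such as Person appear once per detected instance in the record; detect_labels reports each label once, with per-instance bounding boxes available under label["Instances"].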

Clarifai
created on 2023-10-29

people 99.9
group 99.2
adult 98.3
dancing 97.7
man 96.9
woman 96.8
wear 94
music 93
dancer 92.6
many 88.6
group together 87.7
musician 86.9
education 86.5
veil 85.6
actor 84.5
actress 81.5
interaction 80.7
portrait 80.7
leader 80
child 79.5

Imagga
created on 2022-03-05

newspaper 33.8
people 26.7
product 26
locker 24.5
home 23.9
person 23.5
man 21.5
creation 20.9
interior 20.3
fastener 19.6
male 19.1
adult 18.5
indoors 18.4
room 17.9
indoor 17.3
portrait 16.2
restraint 14.9
window 14.9
door 14.3
house 14.2
case 13.7
happy 13.1
device 12.9
smile 12.1
men 12
inside 12
pretty 11.9
women 11.9
blackboard 11.7
business 11.5
light 11.4
sliding door 11.2
refrigerator 10.8
black 10.8
one 10.4
looking 10.4
domestic 10.2
model 10.1
attractive 9.8
film 9.7
lady 9.7
office 9.6
smiling 9.4
family 8.9
standing 8.7
work 8.6
happiness 8.6
elegant 8.6
clothing 8.5
face 8.5
modern 8.4
elegance 8.4
white goods 8
businessman 7.9
bride 7.7
building 7.6
fashion 7.5
human 7.5
decoration 7.4
appliance 7.4
alone 7.3
home appliance 7.3
board 7.2
sexy 7.2
dress 7.2
lifestyle 7.2
cute 7.2
life 7.2
kitchen 7.1
working 7.1
architecture 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.2
clothing 95.8
person 94.7
dance 92.1
woman 79.1
footwear 73.1
cartoon 53.9

Face analysis

AWS Rekognition

Age 28-38
Gender Female, 98.4%
Sad 93.1%
Calm 4.2%
Angry 0.7%
Disgusted 0.5%
Surprised 0.5%
Confused 0.4%
Fear 0.4%
Happy 0.2%

AWS Rekognition

Age 35-43
Gender Male, 73.3%
Calm 46.6%
Disgusted 17.6%
Confused 13.7%
Sad 9.9%
Angry 4.9%
Surprised 4.1%
Happy 2.6%
Fear 0.7%

AWS Rekognition

Age 29-39
Gender Female, 58.8%
Calm 63.6%
Sad 19.8%
Confused 6.6%
Angry 3.7%
Disgusted 3.7%
Fear 1.2%
Surprised 0.9%
Happy 0.6%

AWS Rekognition

Age 31-41
Gender Female, 81.9%
Happy 61.4%
Surprised 12.9%
Calm 6.8%
Fear 6.4%
Sad 5.8%
Angry 3.2%
Disgusted 1.8%
Confused 1.7%
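
A hedged sketch of how per-face age, gender, and emotion estimates like the AWS Rekognition blocks above could be obtained with detect_faces; the image path is an assumption:

```python
import boto3

client = boto3.client("rekognition")

with open("gould_dance_lesson.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request AgeRange, Gender, Emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    # Emotions come back unsorted; order by confidence as in the blocks above.
    emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in emotions:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```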

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
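
The Google Vision rows above report likelihood buckets rather than percentages. A sketch, under the same assumed file name, of how they could be read from the google-cloud-vision face_detection response:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("gould_dance_lesson.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum (VERY_UNLIKELY ... VERY_LIKELY).
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```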

Feature analysis

Amazon

Person
Poster
Person 98.9%
Person 98.3%
Person 92.4%
Person 91.5%
Person 88.1%
Person 87.2%
Person 78.8%
Person 73.6%
Person 70.2%
Person 58.1%
Poster 88.1%

Categories

Imagga

interior objects 71.9%
paintings art 21.8%
text visuals 4.2%