Human Generated Data

Title

Untitled (four men in hats and suits)

Date

1928

People

Artist: Hamblin Studio, American, active 1930s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1914

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 99.7
Human 99.7
Person 99.6
Person 99.5
Person 98.5
Shoe 94.9
Footwear 94.9
Clothing 94.9
Apparel 94.9
People 77.6
Coat 72.4
Person 68.7
Steamer 68.3
Outdoors 67.3
Overcoat 67
Sailor Suit 65.6
Portrait 61.3
Photography 61.3
Face 61.3
Photo 61.3
Suit 59.4
Nature 57.4
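
The Amazon tags above are the kind of output produced by Amazon Rekognition's label-detection API. A minimal Python sketch, assuming boto3 is configured with valid AWS credentials; the file name and confidence threshold are placeholders, not the project's actual settings:

import boto3

rekognition = boto3.client("rekognition")

# Placeholder local copy of the photograph; the actual asset path is not part of this record.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # assumed cutoff; the tags above bottom out near 57
    )

# Each label carries a name and a 0-100 confidence score, matching the list format above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")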

Clarifai
created on 2023-10-25

people 99.9
lid 99
child 98.7
group together 98.4
veil 98
group 97.4
man 96.9
boy 96.1
nostalgia 95.5
three 94.6
outfit 94.5
adult 93.3
uniform 92.9
several 92.6
four 92
two 91.9
nostalgic 90.6
wear 89.2
monochrome 88.7
five 87.6
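
The Clarifai tags are consistent with Clarifai's general concept-prediction model. A hedged sketch against the v2 predict REST endpoint; the API key, model identifier, and image URL are all placeholders and may not match how these tags were actually generated:

import requests

CLARIFAI_KEY = "YOUR_API_KEY"            # placeholder credential
MODEL_ID = "general-image-recognition"   # assumed public general model

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concepts come back with a 0-1 value; scaling by 100 matches the percentages listed above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")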

Imagga
created on 2021-12-14

man 26.9
people 25.1
male 22
silhouette 21.5
person 20.7
sunset 18.9
adult 18.2
television 18.2
black 15.7
youth 15.3
sport 15.1
portrait 14.2
outdoor 13.8
world 13.6
musical instrument 13.6
telecommunication system 13.5
love 13.4
couple 13.1
happy 12.5
boy 12.2
men 12
fun 12
athlete 11.9
happiness 11.8
sky 11.5
body 11.2
player 11.1
women 11.1
lifestyle 10.8
businessman 10.6
fashion 10.6
together 10.5
leisure 10
hand 9.9
human 9.7
summer 9.6
life 9.3
joy 9.2
teenager 9.1
girls 9.1
attractive 9.1
exercise 9.1
outdoors 9
wind instrument 8.9
posing 8.9
bride 8.8
dance 8.7
party 8.6
motion 8.6
walking 8.5
face 8.5
dark 8.4
teen 8.3
dress 8.1
fitness 8.1
shadow 8.1
kin 8.1
light 8
romantic 8
celebration 8
art 7.9
child 7.8
ballplayer 7.8
model 7.8
accordion 7.7
run 7.7
health 7.6
dusk 7.6
keyboard instrument 7.6
relax 7.6
friendship 7.5
evening 7.5
action 7.4
style 7.4
lady 7.3
group 7.3
pose 7.2
sexy 7.2
active 7.2
contestant 7.2
family 7.1
dancer 7.1
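
The Imagga tags resemble responses from Imagga's v2 tagging endpoint. A sketch using HTTP basic authentication; the credentials and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder image URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),           # placeholder credentials
)

# Each entry pairs an English tag with a 0-100 confidence, as in the list above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")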

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

text 98.6
clothing 97.3
man 93
person 88.7
gallery 83.8
old 61.8
posing 61.1
clothes 21.8
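
The Microsoft tags match the shape of Azure Computer Vision's image-tagging output. A sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://YOUR-RESOURCE.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("YOUR_KEY"),              # placeholder key
)

result = client.tag_image("https://example.org/photo.jpg")  # placeholder image URL

# Tags carry a 0-1 confidence; scaling by 100 matches the percentages listed above.
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")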

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 94.7%
Calm 63.7%
Confused 18.9%
Sad 11.9%
Happy 2%
Surprised 1.8%
Fear 1%
Disgusted 0.5%
Angry 0.3%

AWS Rekognition

Age 23-37
Gender Male, 81.1%
Calm 97.9%
Sad 1.2%
Angry 0.3%
Surprised 0.2%
Happy 0.2%
Confused 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 24-38
Gender Male, 93.9%
Calm 94.4%
Sad 2.2%
Surprised 1.3%
Fear 0.6%
Confused 0.6%
Happy 0.4%
Angry 0.4%
Disgusted 0.1%

AWS Rekognition

Age 32-48
Gender Male, 77%
Sad 33.7%
Calm 30.8%
Happy 30%
Confused 2.7%
Fear 1.2%
Surprised 1.1%
Disgusted 0.3%
Angry 0.2%
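
The age ranges, gender estimates, and emotion percentages above correspond to Amazon Rekognition's face-detection output. A minimal sketch, again assuming configured boto3 credentials and a placeholder image file:

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # placeholder local image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, and emotions, not just bounding boxes
    )

# One entry per detected face, mirroring the four blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")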

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
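
The Google Vision rows report likelihood buckets (Very unlikely through Very likely) from the Cloud Vision face-detection feature. A sketch with the google-cloud-vision Python client, assuming application credentials are set and using a placeholder image file:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation exposes the same six likelihood fields shown above.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)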

Feature analysis

Amazon

Person 99.7%
Shoe 94.9%
