Human Generated Data

Title

Untitled (five women gathered outside, one is seated on edges of railing)

Date

c. 1935

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12992

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 99.2
Human 99.2
Person 99.2
Person 99
Person 98.6
Person 98.2
Clothing 92.8
Apparel 92.8
Dress 80.4
Chair 79.9
Furniture 79.9
Female 79.6
People 75.7
Face 64.9
Portrait 64.7
Photography 64.7
Photo 64.7
Crowd 64.4
Chess 62.1
Game 62.1
Girl 61.2
Woman 58.6
Priest 56.1
Person 55.4

Clarifai
created on 2023-10-29

people 100
group 99.1
adult 98.2
woman 97.1
wear 94.8
man 94.6
ceremony 93.8
group together 92.5
outfit 92.3
several 92.1
leader 92
music 90.3
clergy 89.6
child 89.1
many 87.6
priest 87.5
administration 86.3
actress 81.2
uniform 79.9
veil 79

Imagga
created on 2022-02-05

barbershop 39.3
shop 29.7
man 26.2
mercantile establishment 23.6
people 22.9
male 21.3
person 19.8
couple 19.2
kin 17.9
place of business 15.7
portrait 14.2
happy 13.8
musical instrument 13
love 12.6
men 12
old 11.8
adult 11.8
happiness 11.8
bride 11.5
businessman 11.5
black 11.4
life 11.1
room 10.9
family 10.7
business 10.3
women 10.3
newspaper 10.3
vintage 9.9
grunge 9.4
two 9.3
wedding 9.2
silhouette 9.1
dress 9
group 8.9
groom 8.8
smiling 8.7
lifestyle 8.7
face 8.5
wind instrument 8.3
human 8.2
retro 8.2
mother 8.1
romantic 8
home 8
art 7.9
establishment 7.8
smile 7.8
antique 7.8
color 7.8
party 7.7
head 7.6
style 7.4
world 7.4
aged 7.2
product 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 95.8
clothing 91.5
person 90.7
wedding dress 83.9
woman 79.1
wedding 78.2
bride 75.8
old 69.3
dress 60.8
posing 37.3

Face analysis

Amazon

Google

AWS Rekognition

Age 23-31
Gender Male, 97.3%
Sad 42.7%
Happy 21.3%
Calm 21.1%
Confused 4.3%
Disgusted 3.8%
Fear 2.7%
Surprised 2.5%
Angry 1.5%

AWS Rekognition

Age 28-38
Gender Male, 64.1%
Calm 53.3%
Surprised 20.9%
Happy 13%
Disgusted 4.4%
Confused 3.4%
Sad 2.5%
Fear 1.7%
Angry 0.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Chess
Person 99.2%
Person 99.2%
Person 99%
Person 98.6%
Person 98.2%
Person 55.4%
Chess 62.1%
