Human Generated Data

Title

Untitled (photograph of portrait of older woman and girl beside potted flowers)

Date

c. 1930, printed later

People

Artist: Curtis Studio, American, active 1891-1935

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.13164

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 98.6
Human 98.6
Person 98.2
Art 94.6
Painting 93.3
Clothing 92.9
Apparel 92.9
Coat 57.6

Clarifai
created on 2023-10-26

people 100
portrait 99
group 98.6
art 97.4
adult 97.2
wear 96.9
two 95.8
child 94
three 94
uniform 93.6
administration 92.5
woman 92.4
man 92
offspring 87.7
leader 87.3
family 87.2
prince 87.1
royalty 86.4
vintage 86.2
military 85.8

Imagga
created on 2022-01-22

kin 36.6
people 20.1
person 18.3
old 18.1
art 16.3
world 16
vintage 15.7
dress 15.4
adult 14.9
man 14.8
religion 14.3
statue 14.3
fashion 13.6
sculpture 13.5
couple 13.1
portrait 12.9
black 12.9
sibling 11.5
love 11
male 11
happy 10.6
antique 10.5
sexy 10.4
ancient 10.4
model 10.1
military uniform 9.9
face 9.9
child 9.9
human 9.7
one 9.7
culture 9.4
grunge 9.4
history 8.9
clothing 8.8
standing 8.7
uniform 8.6
style 8.2
posing 8
interior 8
hair 7.9
window 7.9
women 7.9
smile 7.8
happiness 7.8
boy 7.8
holy 7.7
historical 7.5
religious 7.5
brother 7.5
monument 7.5
church 7.4
retro 7.4
light 7.3
figure 7.3
body 7.2

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

clothing 99.2
text 98.8
old 96.7
person 95.1
man 90.4
black 83.3
posing 77.1
white 73
human face 57.3
photograph 55.9
vintage 52.4
picture frame 27.4
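Each machine-generated tag above pairs a label with a confidence score (e.g. "Person 98.6"). A minimal sketch of parsing such lines into structured pairs; `parse_tags` is a hypothetical helper, not part of any vendor API:

```python
def parse_tags(lines):
    """Parse raw 'label score' lines into (label, confidence) tuples.

    Labels may contain spaces ('military uniform 9.9', 'picture frame 27.4'),
    so the confidence is taken from the last whitespace-separated token.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample lines transcribed from the tag lists above.
sample = ["Person 98.6", "military uniform 9.9", "picture frame 27.4"]
print(parse_tags(sample))
# [('Person', 98.6), ('military uniform', 9.9), ('picture frame', 27.4)]
```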

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 98.7%
Calm 99.8%
Confused 0.1%
Surprised 0%
Angry 0%
Sad 0%
Fear 0%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 33-41
Gender Female, 99.9%
Confused 90.5%
Calm 7.6%
Happy 1.1%
Surprised 0.2%
Sad 0.2%
Angry 0.1%
Disgusted 0.1%
Fear 0.1%
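An emotion distribution like the one above can be reduced to a single dominant label by taking the highest-scoring entry. A minimal sketch, assuming the scores are percentages; `dominant_emotion` is a hypothetical helper:

```python
def dominant_emotion(scores):
    """Return the (emotion, percentage) pair with the highest score."""
    return max(scores.items(), key=lambda kv: kv[1])

# Values transcribed from the second AWS Rekognition face above.
face = {"Confused": 90.5, "Calm": 7.6, "Happy": 1.1, "Surprised": 0.2,
        "Sad": 0.2, "Angry": 0.1, "Disgusted": 0.1, "Fear": 0.1}
print(dominant_emotion(face))
# ('Confused', 90.5)
```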

Microsoft Cognitive Services

Age 40
Gender Female

Microsoft Cognitive Services

Age 24
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Painting 93.3%
Coat 57.6%