Human Generated Data

Title

Untitled (two women, seated, full-length, seashore backdrop)

Date

c. 1856 – c. 1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3865

Machine Generated Data

Tags

Amazon
created on 2021-04-03

Person 96.6
Human 96.6
Person 95.9
Furniture 95.1
Clothing 94.5
Apparel 94.5
Person 92.2
Painting 86.1
Art 86.1
Fish 78
Animal 78
Face 68.8
Hat 64.1
Chair 61.9
Sitting 59.4

Clarifai
created on 2021-04-03

people 99.8
child 98.2
portrait 98.1
art 98.1
woman 98.1
two 97.7
adult 96.2
man 95.8
son 94.6
music 91.9
group 90.4
girl 90.2
sit 90
furniture 89.9
offspring 89.8
wear 89.4
family 89.3
boy 88.4
guitar 87.6
seat 86.2

Imagga
created on 2021-04-03

stringed instrument 45
musical instrument 38.4
guitar 36.7
acoustic guitar 28.7
person 23.3
man 22.8
black 21.8
bowed stringed instrument 20.1
people 20.1
male 19.9
adult 19.8
sexy 18.5
violin 16.8
attractive 16.8
portrait 16.2
body 13.6
passion 13.2
grunge 12.8
music 12.6
model 12.4
silhouette 12.4
kin 12.4
fashion 12.1
one 11.9
art 11.7
human 11.2
book jacket 11.2
style 11.1
sensuality 10.9
symbol 10.8
electric guitar 10.6
pretty 10.5
culture 10.2
dirty 9.9
suit 9.9
business 9.7
jacket 9.7
office 9.6
hair 9.5
dance 9.5
love 9.5
dark 9.2
studio 9.1
vintage 9.1
sensual 9.1
lady 8.9
posing 8.9
women 8.7
rock 8.7
hot 8.4
wind instrument 8.3
sport 8.2
dress 8.1
newspaper 8
couple 7.8
face 7.8
room 7.8
men 7.7
modern 7.7
expression 7.7
device 7.3
makeup 7.3
product 7.3
lifestyle 7.2
businessman 7.1

Google
created on 2021-04-03

Microsoft
created on 2021-04-03

person 98.7
clothing 94.8
text 93.6
human face 92.8
smile 67.9
old 56.1
retro 54.4
man 52.8
posing 47

Face analysis

AWS Rekognition

Age 14-26
Gender Female, 99.2%
Calm 76.6%
Sad 12.3%
Happy 3.9%
Angry 2.6%
Confused 1.6%
Fear 1.1%
Disgusted 1%
Surprised 0.9%

AWS Rekognition

Age 12-22
Gender Male, 93.1%
Calm 93.5%
Happy 4.1%
Sad 0.7%
Confused 0.6%
Surprised 0.6%
Angry 0.2%
Disgusted 0.2%
Fear 0%

Microsoft Cognitive Services

Age 39
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 96.6%
Painting 86.1%
Fish 78%

Captions

Microsoft
created on 2021-04-03

an old photo of a person 75.9%
a person posing for the camera 75.8%
old photo of a person 72.7%