Human Generated Data

Title

Untitled (Native American man, standing, wearing traditional dress labeled (?) Grey Bear)

Date

1879-1889

People

Artist: F. Jay Haynes, American, 1853-1921

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Harvard College Library from the Bequest of Evert Jansen Wendell, 2.2002.4083

Machine Generated Data

Tags

Amazon
created on 2019-11-10

Apparel 100
Clothing 100
Fashion 99.6
Robe 99.6
Gown 98.5
Kimono 96.6
Person 95.9
Human 95.9
Dress 70.4
Evening Dress 63
Costume 59.7
Art 56
Painting 56
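The Amazon list above has the shape of the `Labels` array in an Amazon Rekognition `detect_labels` response, flattened to name/confidence pairs. A minimal sketch of producing that flat form is below; the live `boto3` call is shown only in comments, since it needs AWS credentials and the image bytes, and `sample_response` is an illustrative stand-in, not output for this object.

```python
# Sketch: flattening a Rekognition detect_labels response into
# "Name Confidence" lines like the tag list above. The live call would be:
#   import boto3
#   client = boto3.client("rekognition")
#   response = client.detect_labels(
#       Image={"Bytes": open("photo.jpg", "rb").read()},
#       MinConfidence=50,
#   )

def format_labels(response):
    """Render each detected label as 'Name Confidence' (one decimal place)."""
    return [
        f"{label['Name']} {label['Confidence']:.1f}"
        for label in response["Labels"]
    ]

# Illustrative stand-in for a real API response.
sample_response = {
    "Labels": [
        {"Name": "Apparel", "Confidence": 99.6},
        {"Name": "Person", "Confidence": 95.9},
    ]
}

for line in format_labels(sample_response):
    print(line)
```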

Clarifai
created on 2019-11-10

people 98
retro 96.2
woman 96
adult 96
man 95.8
art 95.7
one 95.1
wear 95
old 92.1
illustration 91.7
vintage 90.6
portrait 89.9
desktop 89.6
antique 89.5
ancient 87.8
sculpture 85
bill 84.5
sepia pigment 81.7
paper 80.7
vector 79.8

Imagga
created on 2019-11-10

person 20.7
crutch 18.2
old 18.1
human 18
vintage 17.4
dress 17.2
body 16
anatomy 15.5
ruler 15.1
skeleton 14.6
people 14.5
man 14.1
staff 14.1
model 14
art 13.8
fashion 13.6
paper 13.5
golfer 13.4
portrait 12.9
black 12.7
grunge 11.9
bones 11.8
adult 11.7
stick 11.7
biology 11.4
lady 11.4
style 11.1
texture 11.1
player 10.9
drawing 10.8
retro 10.7
medical 10.6
male 9.9
attractive 9.8
antique 9.8
skull 9.8
sketch 9.4
pose 9.1
health 9
design 9
child 8.9
medicine 8.8
hair 8.7
contestant 8.6
document 8.5
elegance 8.4
holding 8.3
brown 8.1
science 8
spine 7.8
ancient 7.8
empty 7.7
blank 7.7
frame 7.5
clothes 7.5
stucco 7.4
blond 7.3
aged 7.2
sexy 7.2
posing 7.1

Google
created on 2019-11-10

Microsoft
created on 2019-11-10

clothing 97.6
person 96.5
text 96.3
old 84
woman 78
human face 77.4
footwear 75.7
girl 56.9
smile 56
fashion 55.9
vintage 29.3
picture frame 8

Color Analysis

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 81%
Disgusted 0%
Sad 0.1%
Happy 0.2%
Surprised 0%
Angry 0.1%
Calm 99.6%
Confused 0%
Fear 0%
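The block above matches the shape of an Amazon Rekognition `detect_faces` result (requested with `Attributes=["ALL"]`): an `AgeRange`, a `Gender` value with confidence, and an `Emotions` list whose top entry here is Calm at 99.6%. A hedged sketch of pulling those fields out of one `FaceDetails` entry follows; the `face` dict is an illustrative stand-in echoing the values printed above, not raw data from this record.

```python
# Sketch: summarizing one FaceDetails entry from a Rekognition detect_faces
# response. The live call (needs AWS credentials and image bytes) would be:
#   response = boto3.client("rekognition").detect_faces(
#       Image={"Bytes": image_bytes}, Attributes=["ALL"])
#   face = response["FaceDetails"][0]

def summarize_face(face):
    """Reduce a FaceDetails entry to age range, gender, and top emotion."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        "gender": f"{face['Gender']['Value']}, {face['Gender']['Confidence']:.0f}%",
        "emotion": f"{top['Type'].capitalize()} {top['Confidence']:.1f}%",
    }

# Illustrative stand-in mirroring the values in the section above.
face = {
    "AgeRange": {"Low": 23, "High": 37},
    "Gender": {"Value": "Male", "Confidence": 81.0},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.6},
        {"Type": "HAPPY", "Confidence": 0.2},
    ],
}

print(summarize_face(face))
```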

Microsoft Cognitive Services

Age 38
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 95.9%
Painting 56%

Categories

Imagga

interior objects 92.9%
paintings art 6.2%

Text analysis

Google

IM
IM