Human Generated Data

Title

Untitled (family portrait)

Date

c. 1880

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.32.2

Machine Generated Data

Tags

Amazon
created on 2019-11-07

Human 98.5
Person 98.5
Person 98.3
Person 96.6
Person 94.8
Art 91.3
Outdoors 76.3
Wood 75.8
Soil 71
Nature 67.1
Face 65.5
Photo 62
Portrait 62
Photography 62
Drawing 61.8
Sand 55.9
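
The Amazon tag list above is the shape of output returned by Rekognition's DetectLabels API: one label name plus a confidence score on a 0-100 scale. A minimal sketch with boto3, assuming the image sits in a hypothetical S3 bucket (region, bucket, and key are placeholders, not the museum's actual storage):

```python
import boto3

# Placeholder region; pick whichever region hosts the image.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder bucket and key.
response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "family-portrait.jpg"}},
    MaxLabels=20,
    MinConfidence=55,
)

for label in response["Labels"]:
    # Confidence is already a 0-100 percentage, as in the list above.
    print(f"{label['Name']} {label['Confidence']:.1f}")
```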

Clarifai
created on 2019-11-07

people 99.9
group 99.1
adult 98.9
woman 97.9
man 97
wear 96.8
art 95.4
portrait 93.5
print 92.7
music 88.5
illustration 88.4
two 88.2
child 87.1
facial expression 86.6
furniture 85.3
one 85.2
outfit 84.6
many 83.9
leader 80.6
four 80.4
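
The Clarifai tags would come from its v2 predict endpoint. A hedged sketch against the public general-recognition model (the model ID follows Clarifai's published docs; the API key and image URL are placeholders, and concept values arrive on a 0-1 scale):

```python
import requests

# Placeholder API key and image URL.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_API_KEY"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/portrait.jpg"}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai reports values in 0-1; rescale to match the percentages above.
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```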

Imagga
created on 2019-11-07

sketch 67
drawing 52.7
representation 45.5
art 19.6
statue 16.9
old 16.7
sculpture 16.5
portrait 16.2
face 14.9
hair 14.3
vintage 14.1
model 14
people 13.4
black 13.3
body 12.8
dress 12.6
book jacket 12.6
fashion 12.1
pretty 11.9
attractive 11.9
adult 11.7
person 11.5
antique 11.2
sexy 11.2
ancient 11.2
culture 11.1
jacket 10.9
head 10.9
symbol 10.8
detail 10.5
brunette 10.4
monument 10.3
religion 9.9
history 9.8
posing 9.8
historical 9.4
religious 9.4
figure 9.3
makeup 9.1
design 9
style 8.9
decoration 8.7
doll 8.4
famous 8.4
color 8.3
historic 8.2
sensuality 8.2
closeup 8.1
women 7.9
cute 7.9
architecture 7.8
marble 7.7
human 7.5
feminine 7.5
wrapping 7.4
tourism 7.4
retro 7.4
sensual 7.3
make 7.3
male 7.1
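
Imagga's tagger is a plain REST endpoint behind HTTP Basic auth. A minimal sketch (key, secret, and image URL are placeholders):

```python
import requests

# Placeholder credentials and image URL.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/portrait.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Confidence is 0-100; tag names are keyed by language code.
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")
```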

Google
created on 2019-11-07

Photograph 95.7
Text 86.9
Art 72.1
Collection 63.8
Photography 62.4
Gentleman 61.7
Picture frame 57.1
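
The Google tags correspond to Cloud Vision's label-detection feature. A minimal sketch with the google-cloud-vision client (the local file path is a placeholder; scores arrive on a 0-1 scale):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder local path to the digitized photograph.
with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # score is 0-1; rescale to match the percentages above.
    print(f"{label.description} {label.score * 100:.1f}")
```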

Microsoft
created on 2019-11-07

room 100
scene 100
gallery 99.9
person 96.4
man 95.7
clothing 94.6
drawing 90.9
sketch 82.5
text 82.4
human face 61.8
posing 51
different 39.5
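
The Microsoft tags match Azure Computer Vision's analyze endpoint. A hedged sketch of the v3.2 REST call (endpoint, subscription key, and image URL are placeholders; the 2019 run would have used an earlier API version of the same shape):

```python
import requests

# Placeholder resource endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/portrait.jpg"},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence is 0-1; rescale to match the percentages above.
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```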

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 55%
Happy 45%
Calm 45.6%
Surprised 45%
Sad 45.9%
Angry 53.2%
Disgusted 45%
Confused 45%
Fear 45.2%

AWS Rekognition

Age 12-22
Gender Female, 54.8%
Calm 45.5%
Angry 45%
Disgusted 45%
Happy 45%
Sad 45.7%
Fear 53.2%
Confused 45.2%
Surprised 45.3%

AWS Rekognition

Age 23-37
Gender Female, 50.4%
Happy 45.1%
Disgusted 45%
Sad 45.1%
Fear 45.1%
Calm 54.2%
Confused 45.1%
Angry 45.1%
Surprised 45.3%

AWS Rekognition

Age 27-43
Gender Female, 51.9%
Disgusted 45.1%
Sad 45.4%
Angry 47.4%
Calm 46.5%
Happy 45.1%
Confused 49.4%
Fear 45.3%
Surprised 45.7%
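
Each AWS Rekognition block above is one FaceDetails entry from the DetectFaces API called with Attributes=['ALL'], which adds the age range, gender, and per-emotion confidences. A minimal sketch (bucket and key are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder bucket and key.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "family-portrait.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (e.g. "HAPPY") with 0-100 confidences.
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```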

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 38
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 18
Gender Female
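
The Microsoft age/gender estimates match the Face API's detect call as it worked around 2019, when returnFaceAttributes=age,gender was still offered (those attributes have since been retired). A hedged sketch (endpoint, key, and image URL are placeholders):

```python
import requests

# Placeholder resource endpoint, key, and image URL.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"
resp = requests.post(
    f"{endpoint}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/portrait.jpg"},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].title()}")
```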

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Likely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely
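
Each Google Vision block is one FaceAnnotation from the face-detection feature, which rates attributes on a likelihood enum (VERY_UNLIKELY through VERY_LIKELY) rather than with numeric scores. A minimal sketch (file path is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder local path.
with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum member, e.g. VERY_UNLIKELY.
    for field in ("surprise", "anger", "sorrow", "joy", "headwear", "blurred"):
        value = getattr(face, f"{field}_likelihood")
        print(field.title(), vision.Likelihood(value).name)
```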

Feature analysis

Amazon

Person 98.5%

Categories

Imagga

paintings art 96.9%
pets animals 2.4%
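
These two scores match Imagga's categorizer endpoint; "paintings art" and "pets animals" are category names from its stock personal_photos categorizer. A hedged sketch (credentials and image URL are placeholders):

```python
import requests

# Placeholder credentials and image URL; "personal_photos" is Imagga's
# stock categorizer, whose category names match the pair above.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/portrait.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)
resp.raise_for_status()

for cat in resp.json()["result"]["categories"]:
    print(f"{cat['name']['en']} {cat['confidence']:.1f}%")
```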

Text analysis

Amazon

is
is Vorary
Vorary
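
The repetition here is expected: Rekognition's DetectText reports each detection twice, once as a LINE and again as its constituent WORDs, and "Vorary" is simply a misread of the handwritten inscription. A minimal sketch (bucket and key are placeholders):

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder bucket and key.
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "family-portrait.jpg"}}
)

for detection in response["TextDetections"]:
    # Type is "LINE" or "WORD"; lines and their words both appear,
    # which is why tokens repeat in the list above.
    print(detection["Type"], detection["DetectedText"])
```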

Google

33-20 tish womaw
33-20
tish
womaw
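
Google's OCR output follows the same pattern: the first text annotation is the full detected string ("33-20 tish womaw", again a misread of handwriting), followed by one entry per token. A minimal sketch (file path is a placeholder):

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder local path.
with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    # Entry 0 is the whole detected string; the rest are individual tokens.
    print(annotation.description)
```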