Human Generated Data

Title

Untitled (unidentified woman, seated, with book and child)

Date

1890-1910

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3853

Machine Generated Data

Tags (label followed by confidence score, 0-100)

Amazon
created on 2019-08-06

Furniture 99.8
Chair 98.5
Human 96.4
Sitting 96.4
Person 96
Clothing 83.3
Apparel 83.3
Couch 80.7
Face 75
Person 74.9
Portrait 65.9
Photography 65.9
Photo 65.9
Art 63.6
Painting 63.6
Armchair 59.2
People 58
Text 58
Home Decor 55.8
Canvas 55.7

Clarifai
created on 2019-08-06

people 99.2
child 98.8
woman 93.8
adult 92.5
two 91.1
wear 90.6
baby 89.8
portrait 89.2
sit 87.1
family 86.7
man 86.4
art 86.4
retro 84.7
chair 84.6
room 84.5
one 83.7
group 82.9
son 80.1
furniture 79.9
vintage 79.4

Imagga
created on 2019-08-06

child 38.3
old 26.5
vintage 24
grunge 23
retro 22.9
aged 20.8
antique 20
ancient 19
frame 18.6
texture 16.7
portrait 16.2
people 15.6
person 15.5
border 15.4
home 15.2
paper 14.2
black 13.8
parent 13.8
happy 13.8
empty 13.7
mother 13.5
material 13.4
damaged 13.4
face 12.8
design 12.5
art 12.4
smiling 12.3
male 12.1
computer 12.1
man 12.1
blank 12
world 11.6
space 11.6
dad 11.5
grungy 11.4
adult 11.3
wall 11.1
house 10.9
sepia 10.7
cute 10
color 10
father 10
attractive 9.8
blackboard 9.8
room 9.8
stains 9.7
textured 9.6
film 9.6
spot 9.6
laptop 9.4
business 9.1
element 9.1
family 8.9
kin 8.8
looking 8.8
newspaper 8.8
crumpled 8.7
love 8.7
product 8.4
hand 8.4
historic 8.3
dirty 8.1
cheerful 8.1
office 8.1
decoration 8.1
lifestyle 8
indoors 7.9
text 7.9
broad 7.9
couple 7.8
boy 7.8
grime 7.8
card 7.8
mottled 7.8
decay 7.7
pretty 7.7
old fashioned 7.6
screen 7.6
sheet 7.5
structure 7.4
grain 7.4
rough 7.3
children 7.3
paint 7.2
smile 7.1
kid 7.1
interior 7.1
working 7.1
little 7.1

Google
created on 2019-08-06

Microsoft
created on 2019-08-06

clothing 98.5
baby 97.6
person 97.2
toddler 97.2
text 96.9
human face 95.2
old 88.2
smile 79.9
boy 77.3
white 68.5
black 66.9
child 63.5
posing 45.9
vintage 30.9
picture frame 6.7
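
Each tag above is a label followed by the service's confidence score (0-100). As an illustration only, output of this shape can be produced with AWS Rekognition's DetectLabels operation through boto3; the file name "photograph.jpg" and the MaxLabels/MinConfidence values in the sketch are assumptions for the example, not the settings actually used to generate this record.

import boto3

def tag_image(image_path):
    """Return Rekognition labels for a local image as (name, confidence) pairs."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=20,        # the Amazon list above contains 20 tags
        MinConfidence=55.0,  # the lowest score shown above is 55.7
    )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Print tags in the same "Label score" style as the listings above.
for name, confidence in tag_image("photograph.jpg"):  # hypothetical file name
    print(f"{name} {confidence:.1f}")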

Face analysis

AWS Rekognition

Age 16-27
Gender Male, 94.9%
Happy 0.8%
Disgusted 10.1%
Confused 1.7%
Sad 8.4%
Calm 61.7%
Angry 15.9%
Surprised 1.4%

AWS Rekognition

Age 1-5
Gender Female, 51%
Calm 54.1%
Angry 45.1%
Surprised 45.1%
Sad 45.5%
Disgusted 45%
Happy 45%
Confused 45.2%
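
The two AWS Rekognition entries above correspond to the two faces detected in the photograph, each with an estimated age range, a gender guess, and per-emotion confidence scores. The sketch below shows, under the same placeholder-file assumption, how Rekognition's DetectFaces operation returns data of this kind; it is not the exact pipeline behind this record.

import boto3

def analyze_faces(image_path):
    """Run Rekognition face analysis and print age, gender, and emotion scores."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        for emotion in face["Emotions"]:
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')

analyze_faces("photograph.jpg")  # hypothetical local copy of the image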

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 3
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
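
The two Google Vision entries are likewise per-face results, reported as likelihood ratings (Very unlikely through Very likely) rather than numeric scores. A rough sketch using the google-cloud-vision client follows; the file name is again a placeholder assumption.

from google.cloud import vision

def face_likelihoods(image_path):
    """Print Google Vision likelihood ratings for each detected face."""
    client = vision.ImageAnnotatorClient()
    with open(image_path, "rb") as f:
        content = f.read()
    response = client.face_detection(image=vision.Image(content=content))
    for face in response.face_annotations:
        # Each field is a Likelihood enum value; .name gives e.g. VERY_UNLIKELY.
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)

face_likelihoods("photograph.jpg")  # hypothetical local copy of the image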

Feature analysis

Amazon

Person 96%
Painting 63.6%

Text analysis

Amazon

.20013853
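
The detected string comes from optical character recognition on the photograph. Results of this kind can be obtained with Rekognition's DetectText operation, sketched below under the same placeholder-file assumption.

import boto3

def detect_text(image_path):
    """Return the text lines Rekognition finds in a local image."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_text(Image={"Bytes": image_bytes})
    # Rekognition returns both LINE and WORD detections; keep the lines.
    return [d["DetectedText"] for d in response["TextDetections"] if d["Type"] == "LINE"]

print(detect_text("photograph.jpg"))  # hypothetical local copy of the image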