Human Generated Data

Title

Untitled (The Mecca, Chicago, Illinois)

Date

February 1950

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4337.1

Machine Generated Data

Tags

Amazon
created on 2019-04-09

Apparel 99.8
Clothing 99.8
Person 99.4
Human 99.4
Person 99
Person 93.1
Face 92.7
Dress 92.7
Kid 83.3
Child 83.3
Female 82
Girl 75.4
Nature 75.3
Hat 70.5
People 68.2
Photography 65.6
Photo 65.6
Portrait 65.6
Outdoors 63.1
Finger 60
Teen 59.5
Smoke 58
Shorts 56.9
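
The scores above are label-detection confidence values on a 0-100 scale, as returned by Amazon Rekognition. Below is a minimal sketch of how comparable tags could be pulled with the DetectLabels API via boto3; the file name, region, and confidence floor are illustrative placeholders, not the settings used for this record.

import boto3

# Label detection against a local copy of the image; region_name and
# MinConfidence are assumed values for illustration only.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,
)

# Each label carries a name and a 0-100 confidence score, matching the
# "Apparel 99.8", "Clothing 99.8", ... format above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')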

Clarifai
created on 2018-08-28

people 99.8
child 97.6
adult 95.7
wear 95.2
group 94.3
two 94
portrait 93.2
retro 92.8
man 91
veil 90.4
woman 85.7
wedding 84.6
three 84.2
son 83.2
group together 82.9
sepia 82.8
monochrome 82.6
vintage 81.9
one 81.7
war 81.7
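
The Clarifai tags are concept predictions with 0-100 confidence values. A rough sketch of a comparable call against Clarifai's v2 predict endpoint follows; the API key, image URL, and model ID are placeholders, and the authentication scheme and model naming have changed over time, so treat the details as assumptions rather than the pipeline used here.

import requests

# Placeholders: substitute a real Clarifai key and an accessible image URL.
# The model ID is assumed; check Clarifai's documentation for the current
# name of the general-purpose tagging model.
API_KEY = "your_clarifai_api_key"
IMAGE_URL = "https://example.org/photo.jpg"
MODEL_ID = "general-image-recognition"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

# Concept values are 0-1 probabilities; scaling by 100 gives the
# "people 99.8", "child 97.6", ... format above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')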

Imagga
created on 2018-08-28

child 29.3
sibling 28.4
grandfather 22.3
person 21.5
man 20.1
people 20.1
mother 20
portrait 18.1
parent 17.1
male 17
kin 17
family 16.9
love 16.6
happy 15.7
human 15
adult 14.9
face 14.9
brother 14.3
old 13.9
boy 13
happiness 12.5
couple 12.2
smile 12.1
childhood 11.6
home 11.2
hair 11.1
hand 10.6
body 10.4
ancient 10.4
eyes 10.3
black 10.2
world 10.2
smiling 10.1
lifestyle 10.1
girls 10
baby 9.9
dress 9.9
art 9.8
health 9.7
expression 9.4
one 9
married 8.6
dark 8.3
fashion 8.3
care 8.2
retro 8.2
religion 8.1
interior 8
women 7.9
look 7.9
antique 7.8
model 7.8
bride 7.7
youth 7.7
elderly 7.7
marriage 7.6
females 7.6
juvenile 7.6
senior 7.5
room 7.5
fun 7.5
traditional 7.5
vintage 7.4
lady 7.3
daughter 7.2
hospital 7.2
aged 7.2
sexy 7.2
looking 7.2
kid 7.1
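
The Imagga tags likewise pair an English keyword with a 0-100 confidence score. A minimal sketch of a comparable request against Imagga's v2 tagging endpoint is shown below; the credentials and image URL are placeholders, not values from this project.

import requests

# Placeholders: substitute real Imagga credentials and an accessible image URL.
API_KEY = "your_imagga_api_key"
API_SECRET = "your_imagga_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

# Each entry pairs an English tag with a 0-100 confidence score, matching
# the "child 29.3", "sibling 28.4", ... format above.
for entry in resp.json()["result"]["tags"]:
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')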

Google
created on 2018-08-28

Microsoft
created on 2018-08-28

wall 95
person 88.5
old 80.5
posing 36.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Female, 97.4%
Surprised 2%
Happy 3.9%
Sad 39.5%
Disgusted 2.8%
Confused 2.6%
Calm 45.5%
Angry 3.6%

AWS Rekognition

Age 14-23
Gender Female, 96.5%
Confused 4.4%
Sad 64.4%
Surprised 5.9%
Angry 3.3%
Calm 4.6%
Happy 15.1%
Disgusted 2.2%

AWS Rekognition

Age 26-43
Gender Male, 90.8%
Happy 17.5%
Disgusted 2.2%
Angry 2.7%
Confused 5.8%
Calm 56%
Surprised 6.4%
Sad 9.4%
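
Each AWS Rekognition block above reports, for one detected face, an estimated age range, a gender guess with its confidence, and per-emotion confidence scores. A minimal sketch of how such attributes could be requested with boto3's DetectFaces call follows; the file name and region are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# in addition to the default bounding-box data.
response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotion confidences are reported independently and need not sum to 100,
    # as in the blocks above.
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')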

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft
created on 2018-08-28

an old photo of a boy 70.3%
an old photo of a girl 70.2%
an old photo of a person 70.1%
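
The Microsoft captions are ranked natural-language descriptions with confidence scores. The sketch below shows how comparable candidates could be requested from the Azure Computer Vision describe endpoint; the endpoint URL, key, and API version are placeholders and assumptions, since the 2018 run behind this record used an earlier version of the service.

import requests

# Placeholders: substitute a real Azure Computer Vision endpoint and key.
ENDPOINT = "https://your-resource.cognitiveservices.azure.com"
KEY = "your_subscription_key"

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    params={"maxCandidates": 3},
    data=image_bytes,
    timeout=30,
)
resp.raise_for_status()

# Each candidate caption carries a 0-1 confidence score; scaling by 100
# gives the "an old photo of a boy 70.3%" style shown above.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')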