Human Generated Data

Title

Untitled (Calumet, Pennsylvania)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1297

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Clothing 100
Garden 100
Nature 100
Outdoors 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Gardening 98.2
Person 97.6
Coat 96.7
Sun Hat 96.3
Accessories 90.5
Bag 90.5
Handbag 90.5
Face 90.4
Head 90.4
Path 88.7
Gardener 82.3
Pants 80.6
Photography 77.5
Hat 77.3
Jeans 71.3
Portrait 66.1
Architecture 64.1
Building 64.1
Housing 64.1
Plant 63.5
Potted Plant 63.5
City 63.4
Yard 57.7
Road 57.3
Street 57.3
Urban 57.3
House 57.3
Porch 57.3
People 56.9
Walking 56.4
Shorts 56.4
Cap 56.3
Bonnet 56.2
Sidewalk 55.5
Baseball Cap 55.3

Clarifai
created on 2018-05-11

people 100
child 99.8
two 98.6
adult 98.5
offspring 97.6
boy 95.5
group 95.1
administration 95
sibling 94.9
man 94.4
three 93.5
group together 93.4
woman 90.6
wear 90.6
war 89.8
military 88.8
son 88.2
family 87.9
portrait 87.4
several 86.7

Imagga
created on 2023-10-05

kin 35.5
man 27.5
portrait 22
people 21.8
male 21.5
person 20.5
military uniform 20.4
love 19.7
child 19.4
couple 19.2
adult 18.1
uniform 17.6
parent 17.1
family 16.9
happiness 16.5
clothing 16.4
world 16
lifestyle 15.2
happy 15
fun 15
outdoors 14.2
mother 14.1
outdoor 13.8
dad 13
father 12.7
dress 12.6
joy 12.5
day 11.8
park 11.5
together 11.4
fashion 11.3
human 11.2
culture 11.1
hair 11.1
women 11.1
smile 10.7
attractive 10.5
two 10.2
smiling 10.1
covering 10.1
grandfather 9.8
bride 9.6
boy 9.6
walking 9.5
consumer goods 9.4
beach 9.3
groom 9.2
old 9.1
summer 9
sky 8.9
sexy 8.8
statue 8.8
face 8.5
black 8.4
traditional 8.3
city 8.3
leisure 8.3
wedding 8.3
vintage 8.3
romance 8
life 7.9
color 7.8
pretty 7.7
wall 7.7
married 7.7
casual 7.6
togetherness 7.6
relationship 7.5
action 7.4
vacation 7.4
joyful 7.3
girls 7.3
playing 7.3
danger 7.3
active 7.2
romantic 7.1
look 7
model 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 100
person 99.4
standing 86
old 49.7
posing 45.1

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 43-51
Gender Female, 100%
Calm 48.1%
Happy 43.5%
Surprised 6.6%
Fear 6.1%
Sad 3.4%
Disgusted 2.4%
Angry 0.8%
Confused 0.5%

AWS Rekognition

Age 34-42
Gender Male, 97.1%
Calm 99.7%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 6-12
Gender Female, 99.5%
Calm 47.9%
Happy 45.3%
Surprised 6.7%
Fear 6.4%
Sad 2.5%
Confused 1.9%
Angry 0.8%
Disgusted 0.6%

Microsoft Cognitive Services

Age 22
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Handbag 90.5%
Hat 77.3%
Jeans 71.3%

Categories

Text analysis

Amazon

ISSAN