Human Generated Data

Title

Untitled (Somerset, Ohio)

Date

July 1938–August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2654

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Each tag is paired with the model's confidence score, in percent.

Amazon
created on 2023-10-06

Furniture 100
Clothing 100
Adult 99.7
Male 99.7
Man 99.7
Person 99.7
Sitting 98.5
Adult 97.2
Male 97.2
Man 97.2
Person 97.2
Brick 96.4
Indoors 96.1
Restaurant 96.1
Bench 94.2
Face 88.1
Head 88.1
Person 80.6
Plant 78.8
Cap 73.7
Hat 67
Photography 65.8
Portrait 65.8
Coat 61.7
Diner 57.3
Food 57.3
Formal Wear 56.7
Suit 56.7
Potted Plant 56.5
People 55.6
Cafeteria 55.1
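
The list above is the shape of output returned by AWS Rekognition's DetectLabels operation. A minimal sketch in Python, assuming configured boto3 credentials; the file name and MinConfidence cutoff are hypothetical, not part of the museum's pipeline:

```python
# Sketch: reproduce label/confidence pairs like the Amazon list above.
import boto3

client = boto3.client("rekognition")

with open("somerset_ohio.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # assumed cutoff; the list above bottoms out near 55
)

# Each label carries a name and a confidence score in percent.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```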

Clarifai
created on 2018-05-10

people 99.6
adult 95.5
sit 95.4
man 95
one 93.6
furniture 89.7
indoors 87.7
woman 87
portrait 84.7
child 84.5
chair 80.1
war 79.8
newspaper 79.8
monochrome 79.5
boy 78.9
room 78.5
seat 77.9
nostalgia 77.3
family 76.9
actor 75.7
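
Tags like these can be requested from Clarifai's v2 prediction endpoint. A hedged sketch using plain REST; the API key, model ID, and image URL are placeholders, and the client that produced these 2018 tags may have differed:

```python
# Sketch: request general-model concepts from Clarifai's v2 REST API.
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"          # placeholder
MODEL_ID = "general-image-recognition"     # assumed public general model
IMAGE_URL = "https://example.org/somerset_ohio.jpg"  # hypothetical

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Concepts come back with a 0-1 value; scaling by 100 matches the list above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```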

Imagga
created on 2023-10-06

newspaper 40.8
blackboard 35
product 34.5
creation 27
office 21.4
business 21.2
man 19.5
male 19.1
daily 18.9
businessman 17.6
person 17.2
people 15.6
adult 15.6
old 15.3
building 15.2
laptop 14.6
vintage 13.2
computer 13.1
architecture 12.5
shop 12.3
room 12.2
wall 12
travel 12
black 11.4
professional 11.1
window 10.7
sitting 10.3
finance 10.1
symbol 10.1
book jacket 9.9
job 9.7
working 9.7
jacket 9.6
men 9.4
portrait 9.1
work 8.9
success 8.8
indoors 8.8
corporate 8.6
art 8.6
culture 8.5
scholar 8.4
modern 8.4
sign 8.3
alone 8.2
technology 8.2
religion 8.1
looking 8
home 8
post 7.6
door 7.6
house 7.5
outdoors 7.5
silhouette 7.4
letter 7.3
smiling 7.2
history 7.1
interior 7.1
notebook 7
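
Imagga exposes a v2 tagging endpoint that returns tag/confidence pairs of this form. A hedged sketch; the key, secret, and image URL are placeholders:

```python
# Sketch: fetch tag/confidence pairs from Imagga's v2 tagging endpoint.
import requests

API_KEY = "YOUR_IMAGGA_KEY"        # placeholder
API_SECRET = "YOUR_IMAGGA_SECRET"  # placeholder
IMAGE_URL = "https://example.org/somerset_ohio.jpg"  # hypothetical

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
resp.raise_for_status()

# Tags arrive with English names and confidence scores in percent.
for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```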

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

building 99.3
outdoor 95.3
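
Tags such as these come from the Azure Computer Vision analyze operation. A hedged sketch against the current v3.2 endpoint; the 2018 run likely used an earlier API version, and the endpoint, key, and image URL are placeholders:

```python
# Sketch: request image tags from Azure Computer Vision's analyze endpoint.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/somerset_ohio.jpg"             # hypothetical

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Confidence is reported on a 0-1 scale; scaling by 100 matches the list above.
for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```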

Face analysis

AWS Rekognition

Age 36-44
Gender Female, 96.6%
Fear 73.8%
Sad 60.4%
Surprised 7.4%
Calm 6.6%
Confused 2%
Disgusted 1.3%
Angry 0.9%
Happy 0.9%

AWS Rekognition

Age 57-65
Gender Male, 100%
Calm 86%
Sad 8.9%
Surprised 6.3%
Fear 6%
Confused 1.2%
Angry 1.1%
Happy 0.6%
Disgusted 0.1%

AWS Rekognition

Age 23-31
Gender Female, 73.6%
Calm 54.9%
Fear 18.8%
Happy 15.9%
Surprised 7%
Sad 4.3%
Confused 1.8%
Disgusted 0.8%
Angry 0.6%
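
The age-range, gender, and emotion estimates above are the output shape of Rekognition's DetectFaces operation with all attributes requested. A minimal sketch, again assuming configured boto3 credentials and a hypothetical local file:

```python
# Sketch: per-face age, gender, and emotion estimates from Rekognition.
import boto3

client = boto3.client("rekognition")

with open("somerset_ohio.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```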

Microsoft Cognitive Services

Age 66
Gender Male

Microsoft Cognitive Services

Age 66
Gender Female
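
The Microsoft estimates correspond to the Azure Face API's detect operation with age and gender attributes, which Microsoft has since retired. A hedged sketch of the historical call; endpoint, key, and image URL are placeholders:

```python
# Sketch: the (now-retired) Azure Face API call for age/gender estimates.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder
IMAGE_URL = "https://example.org/somerset_ohio.jpg"             # hypothetical

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

# Each detected face carries an estimated age and gender.
for face in resp.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')
```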

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Possible
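
The likelihood buckets above (Very unlikely through Very likely) are Google Cloud Vision's face-detection output. A minimal sketch, assuming application-default credentials and a hypothetical local file:

```python
# Sketch: per-face likelihood buckets from Google Cloud Vision.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("somerset_ohio.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, e.g. VERY_UNLIKELY or POSSIBLE.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```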

Feature analysis

Amazon

Adult 99.7%
Male 99.7%
Man 99.7%
Person 99.7%
Bench 94.2%
Plant 78.8%
Hat 67%
Coat 61.7%

Text analysis

Amazon

MILLS
DRINK
DD
MILK
100m
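
These strings are the output shape of Rekognition's DetectText operation. A minimal sketch, same assumptions as the label example above:

```python
# Sketch: detect words painted in the scene, e.g. "MILLS" and "DRINK".
import boto3

client = boto3.client("rekognition")

with open("somerset_ohio.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

# WORD detections correspond to the short tokens listed above;
# LINE detections group them into phrases.
for detection in response["TextDetections"]:
    if detection["Type"] == "WORD":
        print(detection["DetectedText"])
```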

Google

MILLS DRINK
MILLS
DRINK
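
Google Cloud Vision's text detection returns the concatenated string first ("MILLS DRINK"), then the individual tokens, matching the list above. A minimal sketch, same assumptions as the face-detection example:

```python
# Sketch: OCR via Google Cloud Vision text detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("somerset_ohio.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# The first annotation is the full concatenated text; the rest are tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```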