Human Generated Data

Title

Untitled (Somerset, Ohio)

Date

July 1938 – August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2035

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (each label is followed by the service's confidence score, in percent)

Amazon
created on 2023-10-05

Furniture 100
Clothing 99.9
Adult 99.6
Male 99.6
Man 99.6
Person 99.6
Brick 98.9
Sitting 98.7
Coat 98.2
Person 97
Indoors 94
Restaurant 94
Face 91.8
Head 91.8
Cap 90.1
Person 90
Bench 83.2
Hat 69
Diner 56.6
Food 56.6
Plant 56.5
Photography 56.4
Portrait 56.4
Baseball Cap 56.2
Cafeteria 56.1
Jacket 55.9
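
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels output. As a minimal sketch of how such tags could be reproduced with boto3 (the region, bucket, and object key below are placeholders, not values from this record):

import boto3

# Placeholder region; any region where Rekognition is available works.
client = boto3.client("rekognition", region_name="us-east-1")

# DetectLabels returns label names with confidence percentages,
# matching the "label score" pairs listed above.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    MinConfidence=55,  # the lowest scores above sit just below 56
)

for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))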

Clarifai
created on 2018-05-10

people 99.9
adult 97.7
one 97.5
man 96.3
administration 91.1
street 90.1
furniture 89.7
sit 87.8
two 87.6
wear 87.2
chair 86.9
group 86.5
woman 85.4
newspaper 83.4
war 82.7
boy 82.4
portrait 81.3
outfit 81.2
group together 79.7
child 79.7
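
The Clarifai concepts above are percentage renderings of the 0–1 scores returned by its predict endpoint. A hedged sketch against the v2 REST API, assuming a general-recognition model; the API key, model ID, and image URL are all placeholders:

import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"  # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed model name
IMAGE_URL = "https://example.org/photo.jpg"  # placeholder input

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    headers={"Authorization": f"Key {API_KEY}"},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence value.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(concept["name"], round(concept["value"] * 100, 1))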

Imagga
created on 2023-10-05

newspaper 43
product 36.6
creation 28.6
man 18.8
daily 18.3
business 18.2
office 18.1
person 17.9
shop 17.6
old 16.7
male 16.3
building 14.9
laptop 14.6
people 13.9
businessman 13.2
vintage 13.2
adult 13
blackboard 12.6
architecture 12.5
computer 12.4
working 12.4
black 11.4
city 10.8
job 10.6
art 10.5
mercantile establishment 10.3
finance 10.1
work 9.8
sign 9.8
window 9.5
book jacket 9.3
travel 9.1
history 8.9
symbol 8.7
men 8.6
barbershop 8.5
room 8.4
park 8.2
alone 8.2
one 8.2
financial 8
chair 7.9
washboard 7.8
professional 7.7
sitting 7.7
device 7.5
outdoors 7.5
letter 7.3
aged 7.2
jacket 7.2
worker 7.2
scholar 7.2
portrait 7.1
modern 7
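
Imagga serves its tagger as a plain REST endpoint with HTTP basic auth, consistent with the tag/score list above. A minimal sketch; the key, secret, and image URL are placeholders:

import requests

# Placeholder credentials; Imagga authenticates with HTTP basic auth.
AUTH = ("YOUR_IMAGGA_API_KEY", "YOUR_IMAGGA_API_SECRET")

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},  # placeholder input
    auth=AUTH,
)
resp.raise_for_status()

# Tags arrive with a confidence score and a language-keyed name.
for tag in resp.json()["result"]["tags"]:
    print(tag["tag"]["en"], round(tag["confidence"], 1))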

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

building 99.6
outdoor 98.5

Face analysis

AWS Rekognition

Age 43-51
Gender Female, 52.1%
Fear 32.4%
Surprised 26.6%
Calm 20.3%
Angry 10.4%
Confused 7.6%
Disgusted 6.5%
Sad 4%
Happy 0.8%

AWS Rekognition

Age 57-65
Gender Male, 100%
Calm 97.2%
Surprised 6.3%
Fear 5.9%
Sad 2.6%
Angry 0.6%
Confused 0.4%
Happy 0.2%
Disgusted 0.1%

AWS Rekognition

Age 29-39
Gender Male, 67.5%
Sad 81.3%
Fear 39.9%
Surprised 12.9%
Disgusted 6.9%
Angry 4.4%
Confused 3.6%
Happy 2.4%
Calm 1.2%
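
The three blocks above follow the shape of Rekognition's DetectFaces response: an estimated age range, a gender call with confidence, and independently scored emotions (which is why the percentages within a block need not sum to 100). A minimal sketch, with placeholder region and image location:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

# Attributes=["ALL"] adds AgeRange, Gender, and Emotions to each face.
response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are scored independently, so they need not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")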

Microsoft Cognitive Services

Age 55
Gender Male

Microsoft Cognitive Services

Age 59
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
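
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above are qualitative. A minimal sketch with the google-cloud-vision client; the file path is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:  # placeholder path
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a numeric score.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)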

Feature analysis

Amazon

Adult 99.6%
Male 99.6%
Man 99.6%
Person 99.6%
Bench 83.2%
Hat 69%
Plant 56.5%

Text analysis

Amazon

MILLS
DRINK
5
100m
THE
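
Strings such as MILLS and DRINK are characteristic of Rekognition's DetectText, which reads signage in the photograph and returns both LINE and WORD detections. A minimal sketch, with placeholder region and image location:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

response = client.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "photo.jpg"}}
)

# LINE entries group the WORD entries; printing only lines avoids duplicates.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}%")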

Google

ILL
ILL