Human Generated Data

Title

Untitled (Marion, Ohio)

Date

July 1938-August 1938

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.219

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Person 98.5
Person 96.7
Person 96.4
Person 94.1
Person 93.8
Indoors 88
Restaurant 88
Book 82.4
Publication 82.4
Clothing 81.3
Jeans 81.3
Pants 81.3
Face 80.5
Head 80.5
Shop 74.6
Person 65.7
Hat 57.7
Comics 57.6
Text 57.4
Diner 57.1
Food 57.1
Advertisement 56.6
Poster 56.6
Cap 56
Coat 55.8
Newsstand 55.3

Clarifai
created on 2018-05-11

desktop 91
technology 90.4
industry 87.9
retro 87.3
old 85.8
vintage 85.1
illustration 83.4
equipment 83.1
data 81.7
classic 80.3
no person 78.5
cool 78.1
power 76.8
security 76.8
computer 75.2
design 74.8
mix 74.7
dirty 72.7
art 72.7
machine 72.5

Imagga
created on 2023-10-05

musical instrument 56.6
accordion 53.2
keyboard instrument 42.5
wind instrument 33.5
equipment 22.2
device 21.6
metal 19.3
black 19.2
old 16
business 15.2
technology 14.8
music 14.4
vintage 12.4
vehicle 12.3
air conditioner 12.2
server 12
modern 11.9
power 11.7
steel 11.5
retro 11.5
digital 11.3
computer 11.3
mechanism 10.9
transportation 10.7
radio 10.4
object 10.2
car 10.2
cooling system 9.9
machine 9.7
information 9.7
grille 9.7
musical 9.6
auto 9.6
design 9.6
antique 9.5
chrome 9.4
instrument 9.3
classic 9.3
box 9.1
building 8.7
architecture 8.6
close 8.6
3d 8.5
sound 8.4
home 8
camera 8
silver 7.9
text 7.8
storage 7.6
store 7.5
horizontal 7.5
pattern 7.5
industrial 7.3
shiny 7.1

Google
created on 2018-05-11

Color Analysis

Face analysis

Amazon

Microsoft

Google

AWS Rekognition

Age 23-31
Gender Male, 98.3%
Calm 71.1%
Happy 12.2%
Surprised 9.8%
Fear 7.4%
Disgusted 2.7%
Sad 2.6%
Angry 1.7%
Confused 1.5%

AWS Rekognition

Age 35-43
Gender Female, 99.7%
Happy 96%
Surprised 6.8%
Fear 6%
Sad 2.2%
Angry 0.9%
Calm 0.8%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 29-39
Gender Female, 100%
Happy 92.3%
Surprised 6.8%
Fear 6.1%
Sad 2.5%
Calm 2%
Confused 1.2%
Disgusted 1.1%
Angry 0.6%

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Jeans 81.3%

Categories

Imagga

paintings art 99%

Text analysis

Amazon

THIS
THIS WEEK'S
WEEK'S
ATTRACTION
Royal

Google

THIS WEEK'S ATTRACTION
THIS
WEEK'S
ATTRACTION