Human Generated Data

Title

Untitled (Susanna Shahn)

Date

Fall 1936

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.4374.2

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Sleeping 99.8
Person 96.2
Baby 96.2
Body Part 96
Finger 96
Hand 96
Face 95.4
Head 95.4
Photography 95.4
Portrait 95.4
Furniture 86.6
Blanket 75.5
Art 57.5
Chair 56.6
Bed 56.2
Couch 56.2
Home Decor 56
Drawing 55.3
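
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. As a rough illustration only (not the museum's actual pipeline), a minimal sketch of producing similar output with boto3 follows; the image file name and the confidence threshold are assumptions.

# Minimal sketch: label detection with AWS Rekognition (boto3).
# The image path and MinConfidence value are illustrative assumptions.
import boto3

client = boto3.client("rekognition")

with open("susanna_shahn_1936.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,  # roughly the lowest confidence shown in the tag list above
)

# Print "Label confidence" pairs in the same shape as the tag list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")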

Clarifai
created on 2018-05-09

people 99.7
adult 98.8
one 98.6
portrait 97
man 96.4
wear 96.4
monochrome 94.6
offense 91.3
woman 89.4
no person 88.2
old 86.3
art 85.9
texture 84.8
vehicle 83.9
vintage 83.8
facial expression 81.2
retro 80.2
street 80
child 77.5
veil 76.8

Imagga
created on 2023-10-06

car mirror 47.4
mirror 44.1
car 28.7
reflector 28.4
kitchen appliance 23
home appliance 22.6
vehicle 22.4
television 22.3
microwave 21.8
drive 20.1
device 19.2
transportation 17.9
automobile 17.2
appliance 16.3
auto 16.3
technology 15.6
equipment 15
road 14.4
black 14.4
travel 14.1
computer 13.4
business 12.1
telecommunication system 12.1
safety 12
driver 11.6
driving 11.6
man 11.4
digital 11.3
wheel 11.3
close 10.8
laptop 10.8
people 10.6
portrait 10.3
electronic 10.3
person 10.1
hand 9.9
adult 9.7
screen 9.5
sitting 9.4
side 9.4
inside 9.2
speed 9.2
transport 9.1
headrest 9
highway 8.7
work 8.6
monitor 8.4
durables 8.1
object 8.1
male 7.8
support 7.7
modern 7.7
old 7.7
media 7.6
traffic 7.6
power 7.6
happy 7.5
retro 7.4
entertainment 7.4
windshield 7.4
window 7.3
new 7.3
metal 7.2
looking 7.2
rest 7.1
smile 7.1
iron 7.1

Google
created on 2018-05-09

Microsoft
created on 2023-10-30

Face analysis

Amazon

AWS Rekognition

Age 9-17
Gender Male, 70.3%
Sad 99.3%
Calm 17.3%
Happy 11.8%
Fear 7.4%
Surprised 6.7%
Disgusted 1.7%
Angry 1.7%
Confused 1.7%
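
The age range, gender estimate, and emotion scores above are the kind of output Rekognition's face analysis returns when all attributes are requested. A minimal sketch, again assuming the same hypothetical local image file:

# Minimal sketch: face attribute estimation with AWS Rekognition (boto3).
# Attributes=["ALL"] requests age range, gender, and emotion scores.
import boto3

client = boto3.client("rekognition")

with open("susanna_shahn_1936.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are reported individually, so they need not sum to 100%.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")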

Feature analysis

Amazon

Person 96.2%
Baby 96.2%

Categories

Imagga

cars vehicles 60.8%
food drinks 33.4%
interior objects 4.3%

Text analysis

Amazon

President and Fellows of Harvard College (Harvard University Art Museums)
P1970.4374.0002

Google

O President and Fellows of Harvard College (Harvard University Art Museums) P1970.4374.0002
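
The text recognized in the image is the Harvard copyright line and the accession number; OCR services typically return both word-level and line-level detections. A minimal sketch, assuming the same hypothetical local file, of pulling line-level detections with Rekognition's text detection:

# Minimal sketch: text detection with AWS Rekognition (boto3).
# Keeps only LINE-level detections so each inscription prints once.
import boto3

client = boto3.client("rekognition")

with open("susanna_shahn_1936.jpg", "rb") as f:  # hypothetical file name
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(f"{detection['DetectedText']} ({detection['Confidence']:.1f}%)")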