Human Generated Data

Title

Untitled (actor wearing pith helmet, Hedgerow, PA)

Date

c. 1938

People

Artist: Joseph Janney Steinmetz, American 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12008

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-15

Person 97.5
Human 97.5
Face 82.3
Transportation 80
Vehicle 80
Train 80
Drawing 76.7
Art 76.7
Poster 55.4
Advertisement 55.4

Clarifai
created on 2023-10-25

people 99.7
snow 98.3
monochrome 97.5
winter 96.6
adult 96.5
man 96.5
wear 95.8
one 95.6
street 93.9
vehicle 93.8
no person 91
portrait 90.5
transportation system 85.9
cold 84.4
art 83.8
indoors 83.8
aircraft 83.2
mask 81.9
dirty 80.1
veil 80.1

Imagga
created on 2022-01-15

dishwasher 73.1
white goods 67.4
home appliance 52.3
appliance 35.9
grunge 24.7
old 23
texture 22.2
wall 17.9
durables 17.9
vintage 17.4
antique 16.4
blackboard 16.2
pattern 15.7
aged 15.4
dirty 15.4
water 15.3
device 15
frame 15
design 14.6
light 13.4
glass 13.2
retro 13.1
paint 12.7
border 12.7
damaged 12.4
grungy 12.3
ancient 12.1
paper 11.9
art 11.7
material 11.6
decay 11.6
drop 10.9
refrigerator 10.5
detail 10.5
close 10.3
space 10.1
fastener 10.1
color 10
backdrop 9.9
locker 9.8
door 9.7
metal 9.6
window 9.2
rough 9.1
black 9
cool 8.9
equipment 8.7
parchment 8.6
cold 8.6
worn 8.6
painted 8.6
decorative 8.3
clean 8.3
decoration 8.3
technology 8.2
backgrounds 8.1
digital 8.1
graphic 8
liquid 7.9
structure 7.9
architecture 7.8
blank 7.7
dirt 7.6
restraint 7.5
transparent 7.2

Google
created on 2022-01-15

Microsoft
created on 2022-01-15

black and white 95.8
text 95.5
human face 89.1
person 84.7
sketch 84.4
drawing 75.9
monochrome 60.2
white goods 54.7
appliance 53.3
woman 52.2

Face analysis

Amazon

AWS Rekognition

Age 22-30
Gender Male, 53.3%
Fear 46.3%
Calm 19.4%
Disgusted 15%
Sad 10%
Confused 5.2%
Angry 2.2%
Happy 1.1%
Surprised 0.7%

Feature analysis

Amazon

Person 97.5%
Train 80%

Categories

Imagga

paintings art 57.7%
food drinks 41%

Text analysis

Amazon

EEE
EEE kil
YTERA
32A8 YTERA
32A8
kil

Google

EEEM
EEEM