Human-Generated Data

Title

Untitled

Date

c. 1935

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1624

Machine-Generated Data

Tags

Amazon
created on 2022-06-03

Human 99.7%
Person 99.7%
Person 99.3%
Mining 94%
Bunker 67.4%
Building 67.4%
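
These labels follow the output format of Amazon Rekognition's DetectLabels API. Below is a minimal sketch of how such tags can be generated, assuming boto3 is installed, AWS credentials are configured, and "photo.jpg" is a placeholder path, not the museum's actual file:

# Minimal sketch: label detection with Amazon Rekognition (boto3).
# Assumes configured AWS credentials; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,
    )

for label in response["Labels"]:
    # Prints lines like "Human 99.7%" or "Mining 94.0%"
    print(f'{label["Name"]} {label["Confidence"]:.1f}%')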

Imagga
created on 2022-06-03

dark 25%
water 20%
black 17.5%
aquarium 16.7%
light 16%
man 14.1%
grunge 13.6%
wall 12.2%
television 12.1%
screen 12%
person 11.9%
old 11.8%
dirty 11.7%
people 11.7%
adult 11.6%
night 11.5%
windshield 11.5%
grungy 11.4%
vessel 10.9%
texture 10.4%
art 10.4%
rain 10.4%
motion 10.3%
space 10.1%
silhouette 9.9%
fantasy 9.9%
blackboard 9.4%
passion 9.4%
vintage 9.1%
fashion 9%
one 9%
wet 8.9%
pattern 8.9%
smoke 8.4%
color 8.3%
protective covering 8.3%
splashes 7.8%
model 7.8%
cold 7.7%
mystery 7.7%
telecommunication system 7.6%
splash 7.6%
drops 7.5%
landscape 7.4%
protection 7.3%
industrial 7.3%
transparent 7.2%
tub 7.1%
portrait 7.1%
male 7.1%
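
These scores match the response shape of Imagga's /v2/tags REST endpoint. A minimal sketch, assuming placeholder API credentials and a publicly hosted image URL:

# Minimal sketch: tagging with the Imagga /v2/tags REST endpoint.
# The API key/secret and image URL below are placeholders.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

for tag in response.json()["result"]["tags"]:
    # Prints lines like "dark 25.0%" or "water 20.0%"
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}%')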

Microsoft
created on 2022-06-03

outdoor 99.3%
person 96.4%
clothing 94%
man 92.8%
text 84%
old 77.6%
white 70.2%
cave 52.2%
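
These tags correspond to the Tags feature of the Azure Computer Vision Analyze Image REST API (v3.2). A minimal sketch, assuming a placeholder endpoint, subscription key, and image URL; Azure reports confidence on a 0-1 scale, so it is scaled to a percentage here:

# Minimal sketch: tagging with the Azure Computer Vision Analyze Image API.
# Endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for tag in response.json()["tags"]:
    # Prints lines like "outdoor 99.3%" or "person 96.4%"
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}%')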

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 97.2%
Calm 29.7%
Happy 19.2%
Angry 13.6%
Sad 12%
Surprised 11.3%
Fear 8.8%
Disgusted 6.9%
Confused 3.7%

AWS Rekognition

Age 22-30
Gender Female, 68.3%
Calm 87%
Surprised 7.1%
Fear 6.1%
Sad 5.6%
Angry 2.7%
Disgusted 0.8%
Confused 0.2%
Happy 0.1%
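
The age ranges, gender estimates, and emotion scores above follow the response shape of Amazon Rekognition's DetectFaces API. A minimal sketch, again assuming configured AWS credentials and a placeholder image path:

# Minimal sketch: face analysis with Amazon Rekognition DetectFaces (boto3).
# Assumes configured AWS credentials; "photo.jpg" is a placeholder path.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]    # e.g. {"Low": 34, "High": 42}
    gender = face["Gender"]   # e.g. {"Value": "Male", "Confidence": 97.2}
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions arrive unsorted; sort by confidence, as in the listing above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')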

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a vintage photo of a person 91.8%
a vintage photo of a group of people standing around a bench 77.3%
a vintage photo of a group of people sitting on a bench 77.2%
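
These ranked caption candidates match the output of the Azure Computer Vision Describe Image REST API (v3.2). A minimal sketch, assuming the same placeholder endpoint and key as in the tagging example above:

# Minimal sketch: captioning with the Azure Computer Vision Describe Image API.
# Endpoint, subscription key, and image URL are placeholders.
import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"                                     # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": "3"},  # return up to three ranked captions
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": "https://example.com/photo.jpg"},
)
response.raise_for_status()

for caption in response.json()["description"]["captions"]:
    # Prints lines like "a vintage photo of a person 91.8%"
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')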