Human Generated Data

Title

Untitled (New York City)

Date

1930s

People

Artist: Joseph Kaplan, American, 1900–1980

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.25

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Person 69.3
Human 69.3
Outdoors 67.5
Person 64.9
Wood 56.3
Person 51.4
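
A note on provenance: tags in this format are typically the output of AWS Rekognition's DetectLabels API, which reports label names with confidence scores from 0 to 100. The sketch below shows how such tags are commonly retrieved; it assumes boto3, configured AWS credentials, and a hypothetical local filename for this photograph, and is not the actual pipeline behind this record.

    import boto3

    # Rekognition client; assumes AWS credentials are configured in the environment.
    client = boto3.client("rekognition")

    # "untitled_nyc.jpg" is a hypothetical stand-in for this photograph.
    with open("untitled_nyc.jpg", "rb") as f:
        image_bytes = f.read()

    # DetectLabels returns label names with confidence scores (0-100),
    # matching the "Person 69.3" style of the tags above.
    response = client.detect_labels(Image={"Bytes": image_bytes}, MinConfidence=50)
    for label in response["Labels"]:
        print(f'{label["Name"]} {label["Confidence"]:.1f}')

The repeated Person entries above likely correspond to individual bounding boxes returned under the label's Instances field, each with its own confidence.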

Clarifai
created on 2023-10-25

people 99.5
monochrome 99.1
portrait 97.2
street 96.9
wedding 96.8
girl 96.5
art 96.4
fence 95.7
group 95.3
vintage 94.6
group together 94.6
adult 94.3
analogue 94
child 93
woman 92.9
bike 92
shadow 90.8
man 90.4
music 90
model 89.4

Imagga
created on 2021-12-14

person 29.5
sexy 28.1
people 24.5
adult 22.7
black 22.6
dark 20.9
fashion 19.6
model 19.4
portrait 19.4
body 18.4
male 17.7
attractive 17.5
man 16.3
pretty 16.1
posing 16
face 14.2
musical instrument 14.2
sensual 13.6
human 13.5
women 13.4
style 13.3
music 13
brass 13
microphone 12.7
device 12.3
looking 12
lifestyle 11.6
wind instrument 11.2
elegant 11.1
hair 11.1
stage 11
makeup 11
lady 10.5
singer 10.4
hands 10.4
performer 10.2
skin 10.1
sensuality 10
party 9.5
men 9.4
sport 9.3
hand 9.1
make 9.1
pose 9.1
dress 9
one 8.9
group 8.9
musician 8.7
erotic 8.6
grunge 8.5
clothes 8.4
clothing 8.4
studio 8.4
slim 8.3
silhouette 8.3
guitar 8.2
fitness 8.1
couple 7.8
concert 7.8
wall 7.7
crowd 7.7
performance 7.7
power 7.5
elegance 7.5
training 7.4
fit 7.4
blond 7.3
danger 7.3
handsome 7.1
happiness 7
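
Imagga tags of this kind are generally produced by Imagga's v2 /tags endpoint, a REST API authenticated with an API key/secret pair over HTTP Basic auth. A minimal sketch, assuming the requests library and placeholder credentials and image URL:

    import requests

    # Placeholder credentials and image URL; substitute real values.
    IMAGGA_KEY = "your_api_key"
    IMAGGA_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/untitled_nyc.jpg"

    # The /tags endpoint returns English tag names with confidence scores
    # comparable to the list above.
    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(IMAGGA_KEY, IMAGGA_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    for entry in resp.json()["result"]["tags"]:
        print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')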

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

outdoor 96.2
concert 91.5
group 88.5
black 87.2
person 85.3
standing 81.3
clothing 66.5
white 65.3
black and white 58.9
old 57.2
posing 56.6

Color Analysis

Face analysis

AWS Rekognition

Age 17-29
Gender Female, 55.4%
Calm 96.1%
Sad 2.5%
Angry 1%
Confused 0.1%
Surprised 0.1%
Disgusted 0.1%
Happy 0.1%
Fear 0%

AWS Rekognition

Age 21-33
Gender Female, 90.6%
Calm 66.9%
Fear 13%
Surprised 5.7%
Disgusted 4.8%
Sad 4.6%
Angry 3.1%
Confused 1.5%
Happy 0.5%

AWS Rekognition

Age 16-28
Gender Female, 64.8%
Calm 93.5%
Angry 2.9%
Happy 1.1%
Sad 0.9%
Disgusted 0.7%
Surprised 0.5%
Confused 0.2%
Fear 0.1%

AWS Rekognition

Age 22-34
Gender Female, 58.3%
Calm 48.4%
Angry 44.2%
Surprised 4.4%
Disgusted 1.9%
Sad 0.5%
Fear 0.3%
Confused 0.3%
Happy 0%

AWS Rekognition

Age 18-30
Gender Female, 73.7%
Calm 61.6%
Sad 35.4%
Angry 1.6%
Happy 0.5%
Fear 0.3%
Surprised 0.2%
Confused 0.2%
Disgusted 0.1%

AWS Rekognition

Age 32-48
Gender Male, 71.3%
Calm 59.5%
Sad 24.7%
Happy 10.7%
Angry 1.8%
Fear 1.5%
Confused 1.1%
Surprised 0.5%
Disgusted 0.2%
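
Per-face age ranges, gender estimates, and ranked emotion scores like the blocks above are the shape of output returned by AWS Rekognition's DetectFaces API when called with Attributes=["ALL"]. A minimal sketch under the same assumptions as before (boto3, configured credentials, hypothetical filename):

    import boto3

    client = boto3.client("rekognition")

    with open("untitled_nyc.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # Attributes=["ALL"] requests AgeRange, Gender, Emotions, and other facial attributes.
    response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f'Age {age["Low"]}-{age["High"]}')
        print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
        # Emotions come back unordered; sort descending to mirror the listing above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')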

Microsoft Cognitive Services

Age 33
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 28
Gender Female

Feature analysis

Amazon

Person 69.3%

Categories

Imagga

interior objects 99.8%

Text analysis

Amazon

98
9
ANN
ures
ANB-BU
الله (Arabic: "Allah")
NELON
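
Fragmentary strings like these are typical of OCR run over incidental signage in a street photograph; for the Amazon column, the usual source is Rekognition's DetectText API. A minimal sketch under the same assumptions as above:

    import boto3

    client = boto3.client("rekognition")

    with open("untitled_nyc.jpg", "rb") as f:  # hypothetical filename
        image_bytes = f.read()

    # DetectText returns both LINE and WORD detections; printing LINE
    # entries yields short fragments like those listed above.
    response = client.detect_text(Image={"Bytes": image_bytes})
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"])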

Google

AnN ures an BU D98
AnN
an
BU
ures
D98