Human Generated Data

Title

East Side, New York City

Date

1947

People

Artist: Jerome Liebling, American, 1924–2011

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, by exchange, P2000.37

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.3
Human 99.3
Person 99
Person 98.6
Person 96.7
Person 93.6
Crowd 78.9
Clothing 76
Apparel 76
Face 69.1
Person 67
Art 64.8
Painting 64.8
People 63.8
Person 56.7
Musician 55.5
Musical Instrument 55.5
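
Label/confidence pairs in this shape are what Amazon Rekognition's DetectLabels API returns. The sketch below is a minimal, hypothetical reproduction using boto3; the local file name is an assumption, not part of this record.

import boto3

# Minimal sketch: request labels for the photograph from AWS Rekognition.
client = boto3.client("rekognition")
with open("east_side_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MinConfidence=55,  # the lowest score in the list above is 55.5
    )

# Print "Label confidence" pairs in the same shape as the tags above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")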

Clarifai
created on 2023-10-26

people 100
group 99.7
adult 99.6
woman 99
group together 98.9
many 98.8
man 97.7
child 97.4
administration 97.3
wear 95.5
war 94.5
leader 94.2
vehicle 93.6
several 93.1
portrait 91.1
boy 90.9
furniture 90
transportation system 89.8
elderly 87.3
three 86.3
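
Concept/confidence lists like this one can come from Clarifai's v2 predict endpoint. The sketch below is a rough call following Clarifai's public REST docs; the model ID, key, and image URL are placeholders, not values from this record.

import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder credential
MODEL_ID = "general-image-recognition"  # assumed general-purpose model ID

# Ask the model to predict concepts for an image given by URL.
resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.org/photo.jpg"}}}]},
)

# Concepts carry a 0-1 value; scale to match the percentages above.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")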

Imagga
created on 2022-01-22

television 30.6
people 23.4
person 21.1
man 20.1
black 19.6
telecommunication system 19
adult 16.3
male 16.3
musical instrument 15.1
silhouette 14.1
night 13.3
sexy 12.8
dress 12.6
dark 12.5
portrait 12.3
couple 12.2
singer 11.8
performer 11.6
spectator 11.5
musician 11.3
love 11
kin 10.6
human 10.5
fun 10.5
body 10.4
style 10.4
art 10.1
model 10.1
happy 10
fashion 9.8
attractive 9.8
one 9.7
party 9.5
passion 9.4
clothing 9.4
business 9.1
pretty 9.1
dance 9.1
posing 8.9
group 8.9
lifestyle 8.7
window 8.5
hand 8.3
stringed instrument 8.3
sensual 8.2
stage 8.1
celebration 8
wind instrument 7.8
dancing 7.7
expression 7.7
old 7.7
lady 7.3
sensuality 7.3
romantic 7.1
women 7.1
businessman 7.1
happiness 7
modern 7
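
Imagga exposes auto-tagging through its v2 /tags endpoint. A minimal sketch, assuming HTTP Basic auth with placeholder credentials and an illustrative image URL:

import requests

# Request tags for an image by URL; auth values are placeholders.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

# Each result pairs an English tag with a confidence score, as above.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")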

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.2
clothing 96.4
man 91.5
person 90.2
black and white 88.2
black 84.6
posing 66.6
people 65.1
old 64.8
white 64.8
group 59.1
vintage 27.2
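
Tags like these match the output of Azure Computer Vision's tag operation. The sketch below follows the v3.2 REST API; the resource endpoint, key, and image URL are placeholders:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

# Request image tags; the subscription key header authenticates the call.
resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)

# Confidence is 0-1; scale to match the percentages above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")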

Color Analysis

Face analysis

AWS Rekognition

Age 56-64
Gender Male, 97.3%
Sad 63.1%
Calm 27.3%
Angry 4%
Confused 3.5%
Fear 0.8%
Surprised 0.5%
Happy 0.5%
Disgusted 0.4%

AWS Rekognition

Age 42-50
Gender Female, 98.2%
Happy 99.6%
Surprised 0.1%
Disgusted 0.1%
Angry 0.1%
Fear 0.1%
Confused 0%
Calm 0%
Sad 0%

AWS Rekognition

Age 72-82
Gender Female, 98.5%
Calm 90.7%
Happy 3.1%
Surprised 2%
Disgusted 1%
Confused 1%
Angry 0.9%
Sad 0.8%
Fear 0.6%

AWS Rekognition

Age 54-64
Gender Female, 100%
Calm 72.9%
Sad 9.8%
Angry 7.3%
Happy 4%
Confused 2.3%
Disgusted 1.9%
Surprised 1.2%
Fear 0.6%

AWS Rekognition

Age 43-51
Gender Female, 51.8%
Calm 97.2%
Surprised 0.9%
Fear 0.8%
Sad 0.5%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%
Angry 0.1%
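
The five blocks above have the shape of Amazon Rekognition's DetectFaces output with all attributes requested. A minimal boto3 sketch, with a hypothetical local file name:

import boto3

client = boto3.client("rekognition")
with open("east_side_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

# Print one block per face: age range, gender, then emotions by confidence.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")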

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 44
Gender Female

Microsoft Cognitive Services

Age 68
Gender Male
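
Age and gender estimates in this form match Microsoft's legacy Face v1.0 detect call; note that Microsoft has since restricted these attributes. A rough sketch with placeholder endpoint, key, and image URL:

import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder

# Request age and gender attributes for each detected face (legacy API).
resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},
    json={"url": "https://example.org/photo.jpg"},
)

for face in resp.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")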

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Painting 64.8%

Categories

Imagga

paintings art 76.5%
people portraits 21.1%
food drinks 1.5%
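
The three categories above match Imagga's public personal_photos categorizer. A rough sketch of the v2 categorization call, with placeholder credentials and an illustrative image URL:

import requests

# Categorize an image against the assumed personal_photos categorizer.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),
)

for item in resp.json()["result"]["categories"]:
    print(f"{item['name']['en']} {item['confidence']:.1f}%")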

Text analysis

Amazon

Pearl

Google

Pearl
Pearl
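
Word detections like "Pearl" are the output of OCR endpoints such as Amazon Rekognition's DetectText; Google Cloud Vision's text_detection method behaves similarly. A minimal boto3 sketch with a hypothetical local file:

import boto3

client = boto3.client("rekognition")
with open("east_side_new_york_city.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_text(Image={"Bytes": f.read()})

# Each detection is either a full LINE or an individual WORD, which is why
# the same string can appear more than once, as in the Google list above.
for detection in response["TextDetections"]:
    print(detection["DetectedText"])  # e.g. "Pearl"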