Human Generated Data

Title

Untitled (Jean Pearson ironing)

Date

1949

People

Artist: W. Eugene Smith, American, 1918–1978

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, P2001.61

Machine Generated Data

Tags

The number after each tag is the model's confidence score, expressed as a percentage.

Amazon
created on 2022-01-09

Person 98.8
Human 98.8
Room 97.2
Indoors 97.2
Person 95.2
Furniture 93.8
Dressing Room 93
Bedroom 92.7
Interior Design 80.2
Clothing 79.3
Apparel 79.3
Bed 65.1
Person 63
Housing 61.9
Building 61.9
Electronics 56.3
Dorm Room 56.3
Mirror 55.8
Living Room 55.8
Person 53.3
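
Label/confidence pairs like these are typically what Amazon Rekognition's DetectLabels operation returns. A minimal sketch with boto3, assuming configured AWS credentials; the region and file path are placeholders:

import boto3

# Placeholder region; any Rekognition-enabled region works.
client = boto3.client("rekognition", region_name="us-east-1")

# Placeholder path standing in for the museum's image file.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,        # cap the number of labels returned
        MinConfidence=50.0,  # drop low-confidence guesses
    )

# Each label carries a Name and a 0-100 Confidence score,
# matching the "Person 98.8"-style rows above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")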

Clarifai
created on 2023-10-25

people 99.8
monochrome 98.1
adult 96.2
portrait 95.5
man 95.4
woman 94.8
one 94.5
model 94.1
two 93.5
room 92.9
indoors 91.3
street 86.8
analogue 84.5
music 84.1
girl 81.2
fashion 80.1
nude 79.9
mirror 79.3
studio 79.3
actor 78.5
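
Concept lists like this one typically come from Clarifai's predict endpoint. A hedged sketch against the public v2 REST API; the API key, model id, and image URL are placeholder assumptions:

import requests

resp = requests.post(
    # "general-image-recognition" is an assumed public model id.
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key CLARIFAI_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.com/photo.jpg"}}}]},
)
resp.raise_for_status()

# Each concept has a name and a 0-1 value; the page above shows it as a percentage.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")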

Imagga
created on 2022-01-09

person 29.3
adult 26.6
male 24.5
man 24.2
black 23.5
people 21.7
world 19.6
fitness 18.1
electric chair 17.5
dumbbell 16.6
human 16.5
posing 16
body 16
one 15.7
weight 15.7
portrait 15.5
exercise 15.4
equipment 15.3
sport 14.9
model 14.8
instrument of execution 14.5
dark 14.2
sports equipment 13.9
cool 13.3
device 13.2
lifestyle 12.3
fashion 12.1
sexy 12
instrument 11.8
sensuality 11.8
muscular 11.5
athlete 11.3
wall 11.1
training 11.1
dress 10.8
face 10.7
happy 10.6
serious 10.5
couple 10.5
love 10.3
grunge 10.2
attractive 9.8
style 9.6
looking 9.6
concrete 9.6
action 9.3
clothing 9.3
elegance 9.2
urban 8.7
men 8.6
performer 8.6
dance 8.6
expression 8.5
casual 8.5
strength 8.4
punching bag 8.3
health 8.3
active 8.2
pose 8.2
dancer 8
skirt 7.8
jumping 7.7
hand 7.7
performance 7.7
room 7.6
strong 7.5
water 7.3
dirty 7.2
interior 7.1
modern 7
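
Imagga exposes tagging through its /v2/tags endpoint, authenticated with an API key/secret pair over HTTP basic auth. A minimal sketch; the credentials and image URL are placeholders:

import requests

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("IMAGGA_API_KEY", "IMAGGA_API_SECRET"),           # placeholder credentials
)
resp.raise_for_status()

# Tags arrive as {"confidence": ..., "tag": {"en": ...}} entries.
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")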

Google
created on 2022-01-09

(no tags recorded for this image)

Microsoft
created on 2022-01-09

clothing 94.7
text 86.3
person 86.2
black and white 84.3
man 78.7
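
Microsoft's tags correspond to Azure Computer Vision's tag operation. A hedged REST sketch against the v3.2 API; the endpoint host, subscription key, and image URL are placeholder assumptions:

import requests

resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY"},  # placeholder key
    json={"url": "https://example.com/photo.jpg"},       # placeholder URL
)
resp.raise_for_status()

# Each tag has a name and a 0-1 confidence; the page above shows percentages.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")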

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 99.8%
Angry 74.9%
Calm 8.2%
Sad 7.7%
Surprised 3.7%
Confused 2.8%
Fear 1.3%
Disgusted 1%
Happy 0.3%
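
An age range, a gender guess with confidence, and a ranked emotion list are the shape of Rekognition's DetectFaces output when all attributes are requested. A boto3 sketch, assuming configured AWS credentials; the file path is a placeholder:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions come back unordered; sort by confidence to match the list above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")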

Microsoft Cognitive Services

Age 29
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
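
Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why this block reads differently from the others. A sketch with the google-cloud-vision client library, assuming configured Google Cloud credentials; the file path is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum; .name yields e.g. "VERY_UNLIKELY".
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)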

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft
created on 2022-01-09

a man holding a gun 25.5%
an old photo of a man 25.4%
a man standing in front of a tv 25.3%
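
Ranked candidate captions with confidence scores are what Azure Computer Vision's describe operation returns. A hedged REST sketch against the v3.2 API; the endpoint host, subscription key, and image URL are placeholder assumptions:

import requests

resp = requests.post(
    "https://YOUR_RESOURCE.cognitiveservices.azure.com/vision/v3.2/describe",
    params={"maxCandidates": 3},                         # ask for several captions
    headers={"Ocp-Apim-Subscription-Key": "AZURE_KEY"},  # placeholder key
    json={"url": "https://example.com/photo.jpg"},       # placeholder URL
)
resp.raise_for_status()

# Captions carry a text and a 0-1 confidence; the page above shows percentages.
for caption in resp.json()["description"]["captions"]:
    print(f"{caption['text']} {caption['confidence'] * 100:.1f}%")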