Human Generated Data

Title

Mexico

Date

1941, printed later

People

Artist: Helen Levitt, American, 1913–2009

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Richard and Ronay Menschel Fund for the Acquisition of Photographs, P2002.1.5

Copyright

© Estate of Helen Levitt

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.2
Human 99.2
Clothing 99.2
Apparel 99.2
Person 99
Hat 95.1
Hand 94.7
People 78.5
Holding Hands 68.9
Face 68
Sun Hat 59.2

Clarifai
created on 2023-10-25

people 100
two 99.4
portrait 99.3
adult 99
three 98.5
man 97.6
child 97.4
veil 96.5
woman 96.1
street 95.4
group 94.4
lid 94
wear 93.9
four 93.7
documentary 92.7
group together 92
son 91.3
offspring 90.6
military 88.4
monochrome 88.4

Imagga
created on 2022-01-08

person 22.6
man 17.5
people 17.3
adult 16.2
human 15.7
wall 15.5
portrait 15.5
male 15
cowboy hat 14.3
model 14
old 13.9
hat 13.2
child 13
black 12
one 11.9
attractive 11.9
city 11.6
fashion 11.3
looking 11.2
clothing 11.1
sexy 10.4
body 10.4
building 10.4
ancient 10.4
parent 10.2
grunge 10.2
world 10.1
lifestyle 10.1
happy 10
dirty 9.9
outdoor 9.9
pretty 9.8
lady 9.7
urban 9.6
hair 9.5
headdress 9.4
dark 9.2
vintage 9.1
danger 9.1
juvenile 8.7
stone 8.4
girls 8.2
dress 8.1
snow 8.1
posing 8
happiness 7.8
mother 7.8
statue 7.7
serious 7.6
skin 7.6
females 7.6
relaxation 7.5
street 7.4
water 7.3
love 7.1
face 7.1

Google
created on 2022-01-08

Hat 91.5
Gesture 85.3
Dress 84.2
Adaptation 79.3
Monochrome photography 73.8
Monochrome 72.4
Vintage clothing 71.9
Room 65.7
History 63.1
Event 62.3
Collar 58.1
Art 55.4
Family 51.7
Photographic paper 51.3
Classic 50.2

Microsoft
created on 2022-01-08

person 98.4
wall 97.9
clothing 97.3
text 96.3
human face 91.2
woman 79
fashion accessory 74.8
black and white 73.8
dress 69.8
smile 53.4
hat 51.1
old 43.5
posing 43

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 100%
Calm 98.7%
Confused 0.6%
Sad 0.2%
Angry 0.2%
Disgusted 0.1%
Happy 0.1%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 42-50
Gender Male, 99.4%
Happy 86.7%
Calm 4.5%
Disgusted 3.4%
Angry 2.6%
Surprised 0.8%
Fear 0.8%
Sad 0.7%
Confused 0.6%

Microsoft Cognitive Services

Age 34
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%
Hat 95.1%

Categories

Imagga

paintings art 98.8%

Text analysis

Amazon

-
и