Human Generated Data

Title

Untitled (Kentucky or Tennessee?)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1158

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 100
Head 100
Photography 100
Portrait 100
Clothing 99.9
Hat 99.9
Cap 99.5
Person 99.5
Adult 99.5
Female 99.5
Woman 99.5
Fence 98.9
Happy 94
Smile 94
Coat 86.2
Outdoors 79.5
Nature 73.4
Snow 67.4
Winter 56.9
Scarf 56.8
Bonnet 56.5
Jacket 56.3
Glove 55.9
Furniture 55.2
City 55.2
Lady 55.1

Clarifai
created on 2018-05-11

people 99.8
one 99.6
adult 99.3
portrait 99.1
woman 98.5
wear 96
administration 91.8
actress 89.6
veil 89.5
winter 82.8
lid 81.8
coat 81.4
two 81.1
outerwear 78.8
child 78.6
music 77.7
recreation 76.2
leader 75.6
facial expression 75.2
outfit 73.1

Imagga
created on 2023-10-06

portrait 33
person 32.2
crossword puzzle 31.6
adult 27.8
face 25.6
puzzle 25.1
attractive 24.5
people 22.9
smile 22.1
happy 21.9
pretty 19.6
black 19.2
hair 19
model 17.9
game 17.2
sexy 16.9
lady 16.2
fashion 15.8
eyes 15.5
newspaper 15.4
expression 15.4
one 14.9
smiling 14.5
lifestyle 14.5
man 14.1
cute 13.6
work 13.4
lips 13
women 12.7
office 12.3
hand 12.2
business 12.1
student 11.8
brunette 11.3
male 11.3
looking 11.2
corporate 11.2
product 11.2
casual 11
professional 11
cheerful 10.6
human 10.5
body 10.4
youth 10.2
happiness 10.2
teenager 10
sensuality 10
window shade 9.8
building 9.8
modern 9.8
businessman 9.7
structure 9.5
creation 9.2
head 9.2
alone 9.1
businesswoman 9.1
child 9.1
worker 8.9
look 8.8
urban 8.7
serious 8.6
manager 8.4
joy 8.4
executive 8.3
style 8.2
protective covering 7.9
wall 7.9
window blind 7.9
sad 7.7
dark 7.5
teen 7.4
girls 7.3
gorgeous 7.2
covering 7.1
job 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

outdoor 93.2
person 89.6
white 72.9

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 79.4%
Sad 91.4%
Confused 21.7%
Angry 19.3%
Surprised 6.8%
Fear 6.4%
Calm 6.2%
Disgusted 6%
Happy 1.1%

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Adult 99.5%
Female 99.5%
Woman 99.5%