Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1097

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Clothing 100
Boy 99.4
Child 99.4
Male 99.4
Person 99.4
Male 99
Person 99
Adult 99
Man 99
Male 98.8
Person 98.8
Adult 98.8
Man 98.8
Pants 98.1
Sun Hat 96.9
Vest 96
Coat 93.1
Face 90.4
Head 90.4
Hat 87
Photography 57.4
Portrait 57.4
Jacket 56.1
Outdoors 55.9
Cowboy Hat 55.9
Wood 55.8
Lady 55.8
Cap 55.7
Shorts 55.5

Clarifai
created on 2018-05-11

people 100
group 98.8
adult 98.7
group together 97.8
two 97.4
man 97
three 96.7
woman 94.8
actor 94.3
administration 94
four 92.2
military 92.2
wear 91.4
several 91.3
portrait 91.2
leader 88.9
war 87.7
outfit 87.1
five 87
uniform 85

Imagga
created on 2023-10-07

man 37.6
person 35.6
male 29.8
people 27.9
old 21.6
hat 18.6
adult 18.4
portrait 16.2
black 15
men 14.6
scholar 14.6
clothing 11.8
world 11.7
intellectual 11.6
uniform 11.3
happy 11.3
looking 11.2
religion 10.7
vintage 10.7
smile 10.7
face 10.6
hand 10.6
statue 10.6
religious 10.3
worker 9.9
art 9.8
historical 9.4
holding 9.1
guy 9
professional 8.9
family 8.9
mask 8.9
building 8.8
smiling 8.7
antique 8.7
ancient 8.6
work 8.6
senior 8.4
city 8.3
washboard 8.3
fashion 8.3
group 8.1
history 8
business 7.9
room 7.8
standing 7.8
device 7.8
god 7.6
casual 7.6
catholic 7.5
equipment 7.5
famous 7.4
lady 7.3
dress 7.2
working 7.1
patient 7
architecture 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.3
outdoor 96.4
old 91.1
standing 75.1
white 66.8
group 61.9
people 56
posing 47.8

Face analysis

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 56.5%
Angry 16.6%
Confused 10.7%
Surprised 8.4%
Fear 6.3%
Sad 5.7%
Happy 2.3%
Disgusted 1.9%

AWS Rekognition

Age 45-51
Gender Male, 87.9%
Calm 58.6%
Confused 31.5%
Surprised 7.6%
Fear 6%
Sad 3.7%
Angry 2.1%
Disgusted 0.7%
Happy 0.5%

AWS Rekognition

Age 41-49
Gender Male, 100%
Calm 99.8%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.1%
Confused 0.1%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 58
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Feature analysis

Amazon

Boy 99.4%
Child 99.4%
Male 99.4%
Person 99.4%
Adult 99%
Man 99%
Coat 93.1%
Hat 87%

Categories

Imagga

paintings art 98.1%
people portraits 1.7%