Human Generated Data

Title

Untitled (sharecropper family, near Little Rock, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2491

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (tag name followed by confidence score, 0-100)

Amazon
created on 2023-10-05

Clothing 100
Sitting 99.8
Person 99.4
Adult 99.4
Male 99.4
Man 99.4
Sun Hat 99.4
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.2
Male 99.2
Boy 99.2
Child 99.2
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 97.4
Baby 97.4
Face 97
Head 97
Person 96.5
Person 93
Adult 93
Male 93
Man 93
Footwear 91.7
Shoe 91.7
Shoe 91.1
Photography 84.9
Hat 84.8
Portrait 82.3
Reading 82
Shoe 81.9
Jeans 69.4
Pants 69.4
Shoe 64.1
Furniture 62.9
Shoe 60.3
Shoe 59.9
Coat 56.9
Cap 56.9
Bench 55.7
Couch 55.6
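
The label and confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels API. A minimal sketch of such a call via boto3 follows; the file name, region, and thresholds are placeholders for illustration, not part of this record.

# Minimal sketch: producing label/confidence pairs like the list above with
# Amazon Rekognition's DetectLabels API via boto3. File name and region are
# placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_sharecropper_family.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55,  # the list above bottoms out around 55
)

# Each label carries a name and a confidence score from 0 to 100.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')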

Clarifai
created on 2018-05-10

people 100
group 99.7
adult 98.9
group together 98
several 97.9
man 97.8
four 97
three 96.4
two 96.4
administration 96.2
wear 95.3
five 95.3
military 94.3
woman 94.1
sit 93
leader 92.7
many 92.4
veil 92.3
lid 90.1
actor 88.6
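
The Clarifai tags above resemble the concept/confidence output of Clarifai's general prediction model. The sketch below assumes Clarifai's v2 REST API; the model identifier, auth header, and payload shape are assumptions drawn from its public documentation (the tags here date from 2018), not from this record.

# Sketch of a Clarifai v2 prediction call that returns concept/confidence
# pairs like the Clarifai list above. Endpoint path, model id, auth header,
# and payload shape are assumptions; verify against current Clarifai docs.
import requests

API_KEY = "your_clarifai_key"  # placeholder
IMAGE_URL = "https://example.org/shahn_sharecropper_family.jpg"  # hypothetical

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
response.raise_for_status()

# Concept values are on a 0-1 scale; scale by 100 to match the list above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')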

Imagga
created on 2023-10-05

electric chair 36.4
device 31.9
instrument of execution 29.4
barbershop 28.3
man 28.2
shop 24.8
male 24.1
people 22.9
instrument 22.8
person 20.4
mercantile establishment 19.6
washboard 18.7
adult 17.7
portrait 14.9
men 13.7
place of business 13.1
black 12.6
fashion 11.3
travel 11.3
old 11.1
chair 11.1
art 11
musical instrument 10.7
hat 10.7
worker 10.5
mask 10.3
work 10.3
sculpture 10.3
smile 10
sexy 9.6
sitting 9.4
happy 9.4
two 9.3
face 9.2
attractive 9.1
dress 9
religion 9
looking 8.8
statue 8.6
human 8.2
women 7.9
couple 7.8
happiness 7.8
culture 7.7
hand 7.6
city 7.5
one 7.5
vintage 7.4
tourism 7.4
wind instrument 7.4
clothing 7.4
emotion 7.4
uniform 7.3
business 7.3
protection 7.3
industrial 7.3
smiling 7.2
history 7.2
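
The Imagga tags above match the shape of output from Imagga's image tagging endpoint. The sketch below assumes the public /v2/tags REST endpoint with HTTP Basic authentication, as described in Imagga's documentation; credentials and the image URL are placeholders.

# Sketch of Imagga's v2 tagging endpoint, which returns tag/confidence pairs
# like the list above. Endpoint, parameters, and response layout are taken
# from Imagga's public docs and should be treated as assumptions.
import requests

API_KEY = "your_imagga_key"        # placeholder credentials
API_SECRET = "your_imagga_secret"
IMAGE_URL = "https://example.org/shahn_sharecropper_family.jpg"  # hypothetical

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')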

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.8
outdoor 97.3
man 95.2
sitting 94.5
people 57.3
old 54.5
posing 39.6
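
The Microsoft tags above have the shape of an Azure Computer Vision "Tag Image" response. The sketch below assumes the v3.2 REST endpoint, which postdates the 2018 tags shown here; the endpoint, header name, and response layout are assumptions from Azure's public documentation.

# Sketch of the Azure Computer Vision Tag Image REST call, which returns
# name/confidence tags like the Microsoft list above. Endpoint version and
# response shape are assumptions; placeholders throughout.
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"   # placeholder
KEY = "your_azure_key"                                           # placeholder
IMAGE_URL = "https://example.org/shahn_sharecropper_family.jpg"  # hypothetical

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
    timeout=30,
)
response.raise_for_status()

# Confidence comes back on a 0-1 scale; multiply by 100 to match the list above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')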

Color Analysis

Face analysis

AWS Rekognition

Age 6-12
Gender Female, 99.9%
Sad 100%
Calm 11.7%
Surprised 6.3%
Fear 5.9%
Angry 0.1%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 27-37
Gender Female, 99.8%
Sad 100%
Surprised 6.4%
Fear 6%
Calm 6%
Confused 0.5%
Disgusted 0.3%
Happy 0.3%
Angry 0.3%

AWS Rekognition

Age 24-34
Gender Male, 100%
Calm 88.9%
Surprised 6.7%
Sad 6.3%
Fear 6%
Angry 0.9%
Confused 0.8%
Disgusted 0.4%
Happy 0.1%

AWS Rekognition

Age 6-16
Gender Male, 96.6%
Sad 99.7%
Calm 30.9%
Surprised 6.3%
Fear 6%
Confused 0.3%
Angry 0.1%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 23-31
Gender Female, 98.9%
Confused 72.7%
Surprised 9%
Calm 8.8%
Fear 8.6%
Angry 3.8%
Sad 3.2%
Happy 0.6%
Disgusted 0.5%
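
The per-face age ranges, gender calls, and emotion percentages above match the structure of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch follows; the file name and region are placeholders.

# Minimal sketch: per-face age range, gender, and emotion scores like those
# above, via Amazon Rekognition's DetectFaces with all attributes enabled.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_sharecropper_family.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # include AgeRange, Gender, Emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions carry confidences; sort descending to mirror the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')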

Microsoft Cognitive Services

Age 7
Gender Male

Microsoft Cognitive Services

Age 59
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
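
The likelihood ratings above (surprise, anger, sorrow, joy, headwear, blurred) correspond to the fields Google Cloud Vision face detection reports as likelihood buckets such as VERY_UNLIKELY. The sketch below assumes the google-cloud-vision Python client; the file path is a placeholder and credentials are taken from the environment.

# Sketch of Google Cloud Vision face detection, which reports the same
# likelihood fields shown above as enum buckets (e.g. VERY_UNLIKELY).
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_sharecropper_family.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)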

Feature analysis

Amazon

Person 99.4%
Adult 99.4%
Male 99.4%
Man 99.4%
Boy 99.2%
Child 99.2%
Baby 97.4%
Shoe 91.7%
Hat 84.8%
Jeans 69.4%

Categories

Imagga

paintings art 94.3%
people portraits 4.9%