Human Generated Data

Title

Untitled (sharecropper family, near Little Rock, Arkansas)

Date

October 1935, printed later

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Museum Acquisition, P1970.3413

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (tag, confidence %)

Amazon
created on 2023-10-07

Clothing 100
Sun Hat 99.7
Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 99
Male 99
Man 99
Person 99
Reading 98.4
Person 97.1
Face 96.6
Head 96.6
Person 95.6
Baby 95.6
Sitting 94.9
Adult 91.5
Male 91.5
Man 91.5
Person 91.5
Photography 91
Portrait 91
Footwear 90.3
Shoe 90.3
Shoe 88.5
Shoe 87.8
Person 82.6
Shoe 80.4
Hat 77.2
Shoe 71.1
Shoe 67.9
Cap 63
Furniture 57.5
Couch 56.4
Baseball Cap 56.3
Outdoors 56.2
Shirt 56
Coat 55.6
Architecture 55.3
Building 55.3
Hospital 55.3
Pants 55.2
Indoors 55.1
Living Room 55.1
Room 55.1
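
The Amazon tags above are the kind of output Amazon Rekognition's label-detection API returns. A minimal sketch, assuming a local copy of the photograph and a default AWS setup; the filename, region, and confidence threshold are placeholders, not part of the record:

```python
# Sketch: generating label/confidence pairs with Amazon Rekognition DetectLabels.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

# Hypothetical local copy of the photograph; not a file from the museum record.
with open("shahn_sharecropper_family.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # the listed tags bottom out around 55
)

for label in response["Labels"]:
    # Each label has a name and a confidence in percent, matching the
    # "Clothing 100", "Sun Hat 99.7", ... style above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```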

Clarifai
created on 2018-05-10

people 100
group 99.8
adult 98.3
group together 98.3
man 98
several 97.8
four 97.7
two 97.5
three 97.2
administration 95.4
veil 95.1
five 93.9
actor 93.5
leader 93
wear 93
lid 91.8
woman 89.9
portrait 87.3
many 86.8
furniture 86.7
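
The Clarifai concepts above resemble the response of Clarifai's v2 prediction endpoint. A hedged sketch using the older key-based REST form; the model ID, API key, and image URL are assumptions, not values from this record:

```python
# Sketch: requesting concept tags from Clarifai's v2 outputs endpoint.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"             # assumption: a valid Clarifai key
MODEL_ID = "general-image-recognition"        # assumption: the general concepts model
IMAGE_URL = "https://example.org/shahn.jpg"   # placeholder image URL

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Clarifai scores concepts in 0-1; scaling by 100 gives the
    # "people 100", "group 99.8" style shown above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```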

Imagga
created on 2023-10-07

musical instrument 43.1
wind instrument 34.9
concertina 32.7
free-reed instrument 26.4
accordion 23.2
man 20.1
people 19.5
keyboard instrument 18.6
device 17.7
statue 17.5
men 17.2
male 17.1
person 17
religion 16.1
monument 15.9
sculpture 14.6
old 14.6
kin 14.5
adult 14.4
art 13.8
portrait 13.6
washboard 12.7
barbershop 12.1
architecture 11.7
history 11.6
shop 11.4
religious 11.2
dress 10.8
traditional 10.8
couple 10.4
ancient 10.4
culture 10.2
black 10.2
city 10
chair 9.8
historical 9.4
face 9.2
historic 9.2
travel 9.1
tourism 9.1
mercantile establishment 9.1
world 9.1
look 8.8
mask 8.6
costume 8.6
business 8.5
fashion 8.3
worker 8.2
new 8.1
women 7.9
smile 7.8
prayer 7.7
stone 7.6
human 7.5
vintage 7.4
clothing 7.4
tradition 7.4
girls 7.3
decoration 7.2
building 7.1
romantic 7.1
family 7.1
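
The Imagga tags above match the shape of Imagga's v2 tagging endpoint, which scores tags on a 0-100 scale. A hedged sketch; the credentials and image URL are placeholders:

```python
# Sketch: requesting tags from Imagga's v2 /tags endpoint (HTTP Basic auth).
import requests

IMAGGA_KEY = "YOUR_KEY"                       # assumption
IMAGGA_SECRET = "YOUR_SECRET"                 # assumption
IMAGE_URL = "https://example.org/shahn.jpg"   # placeholder image URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

for entry in resp.json()["result"]["tags"]:
    # Confidence is already in percent, e.g. "musical instrument 43.1".
    print(f'{entry["tag"]["en"]} {entry["confidence"]:.1f}')
```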

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.7
man 98.8
outdoor 93.5
old 86.4
black 79.2
white 75.6
posing 74.5
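
The Microsoft tags above are consistent with Azure Computer Vision's "Analyze Image" operation with the Tags feature. A hedged sketch against the v3.2 REST surface (the 2018 tags may have come from an earlier version); the endpoint, key, and image URL are placeholders:

```python
# Sketch: requesting tags from Azure Computer Vision's Analyze Image endpoint.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # assumption
KEY = "YOUR_KEY"                                                # assumption
IMAGE_URL = "https://example.org/shahn.jpg"                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidence comes back in 0-1; scaling by 100 matches "person 99.7" above.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```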

Color Analysis

Face analysis

AWS Rekognition

Age 4-12
Gender Male, 59.9%
Calm 65.9%
Sad 65.4%
Surprised 6.4%
Fear 6.3%
Confused 0.7%
Angry 0.5%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 26-36
Gender Female, 100%
Sad 100%
Calm 14.8%
Surprised 6.5%
Fear 6.3%
Disgusted 1%
Angry 0.9%
Confused 0.7%
Happy 0.5%

AWS Rekognition

Age 21-29
Gender Male, 99.7%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0.1%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 10-18
Gender Male, 59.9%
Fear 72.3%
Sad 31.4%
Surprised 12.1%
Calm 6.2%
Confused 5.6%
Disgusted 1.2%
Angry 1.1%
Happy 0.5%

AWS Rekognition

Age 1-7
Gender Male, 82.3%
Confused 46%
Calm 33%
Surprised 8.7%
Sad 8.3%
Fear 6.2%
Disgusted 2.8%
Angry 2.8%
Happy 0.5%
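
The age-range, gender, and emotion blocks above follow the structure of Amazon Rekognition's face-detection response. A minimal sketch, assuming a local copy of the photograph; the filename and region are placeholders:

```python
# Sketch: per-face age range, gender, and emotion scores via Rekognition DetectFaces.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("shahn_sharecropper_family.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # e.g. "Calm 65.9%", "Sad 65.4%", matching the blocks above.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```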

Microsoft Cognitive Services

Age 7
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 46
Gender Female
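
The age and gender estimates attributed to Microsoft Cognitive Services above match the older Azure Face API "Detect" operation. A hedged sketch of that historical request shape (Microsoft has since restricted these attributes); the endpoint, key, and image URL are placeholders:

```python
# Sketch: per-face age/gender estimates via the Azure Face API detect call.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # assumption
KEY = "YOUR_KEY"                                                # assumption
IMAGE_URL = "https://example.org/shahn.jpg"                     # placeholder

resp = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": IMAGE_URL},
)
resp.raise_for_status()

for face in resp.json():
    attrs = face["faceAttributes"]
    # e.g. "Age 34" / "Gender Male", as in the blocks above.
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].capitalize()}')
```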

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
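
The Google Vision blocks above report the per-face likelihood fields returned by Google Cloud Vision face detection. A minimal sketch using the Python client; the image path is a placeholder:

```python
# Sketch: per-face likelihoods (surprise, anger, sorrow, joy, headwear, blur)
# via Google Cloud Vision face detection.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_sharecropper_family.jpg", "rb") as f:  # hypothetical local copy
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

for face in response.face_annotations:
    # Likelihoods are enums such as VERY_UNLIKELY, POSSIBLE, VERY_LIKELY,
    # matching the "Very unlikely" / "Possible" / "Very likely" values above.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```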

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%
Baby 95.6%
Shoe 90.3%
Hat 77.2%

Categories