Human Generated Data

Title

Untitled (sharecropper family, near Little Rock, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.1392

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Face 99.9
Head 99.9
Photography 99.9
Portrait 99.9
Clothing 99.8
Hat 99.8
People 99.8
Firearm 99.4
Gun 99.4
Rifle 99.4
Weapon 99.4
Person 99.3
Person 99.1
Person 98.9
Boy 98.9
Child 98.9
Male 98.9
Person 98.8
Child 98.8
Female 98.8
Girl 98.8
Person 98.6
Male 98.6
Adult 98.6
Man 98.6
Person 97.9
Wood 97.6
Person 97.6
Male 97.6
Adult 97.6
Man 97.6
Door 77.1
Outdoors 74.7
Shirt 61.2
Baseball Cap 57.3
Cap 57.3
Architecture 57.2
Building 57.2
Shelter 57.2
Housing 56.9
Blouse 56.4
Nature 56.4
Countryside 56.4
Hut 56.2
Rural 56.2
Pants 55.8
Window 55.3
Footwear 55.3
Shoe 55.3
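The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's DetectLabels API. As a minimal sketch (using a hypothetical subset of the labels listed above, shaped like the API's `Labels` field), filtering them by a confidence threshold looks like this:

```python
# Hypothetical subset of the label/confidence pairs listed above,
# in the shape of Rekognition DetectLabels output ("Labels" field).
labels = [
    {"Name": "Face", "Confidence": 99.9},
    {"Name": "Firearm", "Confidence": 99.4},
    {"Name": "Door", "Confidence": 77.1},
    {"Name": "Shoe", "Confidence": 55.3},
]

def confident_labels(labels, threshold=90.0):
    """Return label names at or above the confidence threshold,
    highest confidence first."""
    kept = [l for l in labels if l["Confidence"] >= threshold]
    kept.sort(key=lambda l: l["Confidence"], reverse=True)
    return [l["Name"] for l in kept]

print(confident_labels(labels))  # → ['Face', 'Firearm']
```

Note that the list above mixes high-confidence labels (Face, 99.9) with marginal ones (Shoe, 55.3); records like this one typically report everything above some low service-side cutoff, so downstream filtering is left to the consumer.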

Clarifai
created on 2018-05-11

people 100
child 99.9
group 99.7
boy 97.9
adult 97.7
two 96.8
wear 96.3
son 95.8
three 94.8
family 94.6
sibling 93.7
several 93.5
offspring 93
home 92.6
man 92.2
five 92.1
portrait 92
group together 91.6
facial expression 91.2
four 90.8

Imagga
created on 2023-10-06

child 49.5
world 26.8
man 25.5
person 24.9
juvenile 23.2
people 22.3
male 20.8
family 18.7
classroom 18.2
school 16.8
adult 16.3
smiling 15.9
parent 15.7
mother 14.3
room 13.7
kin 13.6
happy 12.5
portrait 12.3
sitting 12
outdoors 11.9
smile 11.4
boy 11.3
men 11.2
old 11.1
20s 11
business 10.9
businessman 10.6
father 10.2
cheerful 9.7
home 9.6
happiness 9.4
culture 9.4
tradition 9.2
teacher 9.2
girls 9.1
vintage 9.1
dad 9
couple 8.7
education 8.7
ancient 8.6
casual 8.5
children 8.2
group 8.1
women 7.9
youth 7.7
student 7.6
grandfather 7.6
talking 7.6
holding 7.4
sibling 7.3
dress 7.2
black 7.2
looking 7.2
childhood 7.2
kitchen 7.2
interior 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.8
outdoor 92.2

Face analysis

AWS Rekognition

Age 18-26
Gender Female, 99%
Sad 95.8%
Angry 39.4%
Surprised 6.6%
Fear 6.1%
Calm 5.1%
Confused 3.5%
Disgusted 1.7%
Happy 0.4%

AWS Rekognition

Age 13-21
Gender Female, 100%
Sad 99.6%
Fear 13.9%
Calm 12.4%
Surprised 6.7%
Confused 2.4%
Disgusted 2%
Angry 1.2%
Happy 1%

AWS Rekognition

Age 16-22
Gender Female, 98.7%
Happy 76.5%
Calm 10.5%
Fear 7.1%
Surprised 6.9%
Sad 5.3%
Angry 1%
Confused 0.6%
Disgusted 0.5%

AWS Rekognition

Age 31-41
Gender Female, 99.9%
Sad 100%
Surprised 6.3%
Fear 6%
Confused 0.8%
Calm 0.3%
Disgusted 0.1%
Angry 0.1%
Happy 0.1%

AWS Rekognition

Age 6-16
Gender Female, 100%
Sad 90.5%
Fear 56.8%
Calm 9.7%
Surprised 6.6%
Confused 0.5%
Angry 0.5%
Disgusted 0.4%
Happy 0.3%

AWS Rekognition

Age 19-27
Gender Male, 99.7%
Confused 93.3%
Surprised 6.4%
Fear 6.4%
Sad 3.3%
Calm 1.1%
Angry 0.6%
Disgusted 0.2%
Happy 0.1%
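Each AWS Rekognition face block above is a per-emotion confidence list of the kind DetectFaces returns; the scores are independent, which is why a block can show Sad at 100% alongside Surprised at 6.3% without summing to 100. A minimal sketch of picking the dominant emotion (the record below copies values from the fourth block above, restated in the API's response shape):

```python
# One face record in the shape of Rekognition DetectFaces output
# (values copied from the fourth AWS Rekognition block above).
face = {
    "AgeRange": {"Low": 31, "High": 41},
    "Gender": {"Value": "Female", "Confidence": 99.9},
    "Emotions": [
        {"Type": "SAD", "Confidence": 100.0},
        {"Type": "SURPRISED", "Confidence": 6.3},
        {"Type": "FEAR", "Confidence": 6.0},
        {"Type": "CONFUSED", "Confidence": 0.8},
    ],
}

def dominant_emotion(face):
    """Pick the emotion with the highest confidence. Scores are
    independent per-emotion confidences, so they need not sum to 100."""
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # → ('SAD', 100.0)
```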

Microsoft Cognitive Services

Age 35
Gender Male

Microsoft Cognitive Services

Age 37
Gender Male

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 9
Gender Male

Microsoft Cognitive Services

Age 37
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
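Unlike the percentage scores above, Google Vision reports face attributes as likelihood buckets on a five-step scale. A sketch of comparing buckets against a threshold, assuming the display strings used in this record (the API itself uses enum names such as `VERY_LIKELY`); the sample values come from the third Google Vision block above:

```python
# Google Cloud Vision's likelihood scale, lowest to highest,
# as the display strings used in this record.
LIKELIHOOD = ["Very unlikely", "Unlikely", "Possible", "Likely", "Very likely"]

def at_least(value, threshold):
    """True if a likelihood bucket meets or exceeds a threshold bucket."""
    return LIKELIHOOD.index(value) >= LIKELIHOOD.index(threshold)

# Values from the third Google Vision block above.
face = {"Sorrow": "Possible", "Headwear": "Very likely", "Joy": "Very unlikely"}
print(at_least(face["Headwear"], "Likely"))  # → True
print(at_least(face["Sorrow"], "Likely"))    # → False
```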

Feature analysis

Amazon

Person 99.3%
Boy 98.9%
Child 98.9%
Male 98.9%
Female 98.8%
Girl 98.8%
Adult 98.6%
Man 98.6%
Shoe 55.3%

Text analysis

Amazon

on thishase