Human Generated Data

Title

Children of a destitute Ozark mountaineer, Arkansas

Date

1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3059

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.1
Person 99
Person 97.6
Person 91.2
Face 82.6
Photo 78.7
Photography 78.7
Portrait 71.3
People 65.9
Military Uniform 62.3
Military 62.3
Paintball 62.3
Baby 56.4
Hunting 55.2
Person 53.8
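
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's label-detection API. Below is a minimal sketch of how similar tags could be requested with boto3; the S3 bucket and object names are placeholders, and the exact labels depend on the Rekognition model version in use.

```python
# Sketch: fetching image labels with Amazon Rekognition via boto3.
# Bucket and object names are placeholders; AWS credentials must be configured.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn_ozark_children.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Prints pairs comparable to "Human 99.1", "Person 99", ... above.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```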

Clarifai
created on 2023-10-15

people 99.9
child 99.3
war 99
soldier 98.6
portrait 98.2
military 98
adult 97.7
group together 97.5
two 97
gun 96.7
uniform 96.6
group 96.4
man 96.2
boy 96.2
army 96.2
wear 95.8
three 94.5
documentary 94.2
retro 93.1
son 92.8
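
The concept/confidence pairs above resemble the output of Clarifai's prediction API. The sketch below assumes the v2 REST endpoint, API-key authentication, and the public general image-recognition model; the key, model ID, and image URL are placeholders.

```python
# Sketch: requesting concepts from Clarifai's v2 REST API.
# API key, model ID, and image URL are placeholders (assumptions).
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"
MODEL_ID = "general-image-recognition"  # assumed public general model
IMAGE_URL = "https://example.org/shahn_ozark_children.jpg"

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept names and confidences, comparable to "people 99.9", "child 99.3", ...
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```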

Imagga
created on 2021-12-15

military uniform 69.8
uniform 67.9
clothing 46.3
rifle 31.9
gun 31.8
weapon 29.6
consumer goods 28.1
covering 27.9
man 26.9
military 24.1
soldier 23.5
war 22.5
people 22.3
adult 20.7
male 19.9
person 19.6
camouflage 18.6
firearm 17.5
danger 17.3
child 16.1
outdoors 15.7
army 15.6
portrait 15.5
protection 15.5
outdoor 14.5
commodity 13.9
gunnery 13.7
mask 13.6
love 13.4
weaponry 13.2
fun 12.7
statue 12.5
family 12.4
holding 12.4
happiness 11.8
bazooka 11.7
recreation 11.7
sport 11.6
world 11.6
boy 11.3
happy 11.3
armament 11
tank 10.9
black 10.2
action 10.2
day 10.2
vehicle 10.1
kin 9.8
battle 9.8
target 9.7
launcher 9.4
two 9.3
training 9.2
travel 9.2
leisure 9.1
mother 9.1
old 9.1
game 8.9
defense 8.8
death 8.7
lifestyle 8.7
military vehicle 8.6
attractive 8.4
playing 8.2
girls 8.2
childhood 8.1
sexy 8
smiling 8
helmet 7.9
warfare 7.9
cute 7.9
together 7.9
pretty 7.7
bride 7.7
sky 7.7
casual 7.6
cannon 7.6
fashion 7.5
tracked vehicle 7.5
joy 7.5
equipment 7.5
park 7.4
sports 7.4
competition 7.3
industrial 7.3
activity 7.2
face 7.1
summer 7.1
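
The tag/confidence pairs above match the shape of Imagga's tagging API response. The sketch below assumes the /v2/tags REST endpoint with HTTP basic authentication; the key, secret, and image URL are placeholders.

```python
# Sketch: requesting tags from Imagga's /v2/tags endpoint (HTTP basic auth).
# API key/secret and image URL are placeholders (assumptions).
import requests

API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/shahn_ozark_children.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # English tag text and confidence, comparable to "military uniform 69.8", ...
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```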

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

person 99.7
toddler 97
clothing 97
baby 96.2
text 95.9
child 94
boy 92.4
human face 90.9
black and white 66.2
smile 61.4
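
The tags above are the kind of result returned by Microsoft's Computer Vision image-analysis service. The sketch below assumes the Azure Computer Vision v3.2 analyze REST endpoint; the endpoint URL, subscription key, and file name are placeholders.

```python
# Sketch: tagging an image with the Azure Computer Vision v3.2 "analyze" endpoint.
# Endpoint, subscription key, and file path are placeholders (assumptions).
import requests

ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_CV_KEY"

with open("shahn_ozark_children.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Tag name and confidence (0-1), comparable to "person 99.7", "toddler 97", ...
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```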

Face analysis

AWS Rekognition

Age 2-8
Gender Female, 80.7%
Calm 85.8%
Sad 13.7%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Fear 0.1%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 1-7
Gender Male, 86.1%
Disgusted 74.1%
Sad 13.5%
Calm 7.5%
Happy 2.8%
Confused 0.7%
Angry 0.6%
Fear 0.5%
Surprised 0.3%
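
The age range, gender, and emotion scores above correspond to the face attributes returned by Amazon Rekognition. A minimal sketch with boto3 is shown below; the file name is a placeholder and AWS credentials are assumed to be configured.

```python
# Sketch: face attributes (age range, gender, emotions) with Amazon Rekognition.
# File name is a placeholder; AWS credentials must be configured.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("shahn_ozark_children.jpg", "rb") as f:
    response = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        # Emotion types come back uppercase ("CALM", "SAD", ...); confidences
        # are comparable to "Calm 85.8%", "Sad 13.7%", ... above.
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```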

Microsoft Cognitive Services

Age 3
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
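
The likelihood labels above (Very unlikely, Unlikely, ...) correspond to the enum values returned by Google Cloud Vision face detection. The sketch below assumes the google-cloud-vision Python client (v2+) with application-default credentials; the file name is a placeholder.

```python
# Sketch: face likelihoods (surprise, anger, sorrow, joy, headwear, blur)
# with the Google Cloud Vision client. File name is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("shahn_ozark_children.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Enum names such as VERY_UNLIKELY map to the "Very unlikely" labels above.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```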

Feature analysis

Amazon

Person 99%