Human Generated Data

Title

Family of rehabilitation client, Boone County, Arkansas

Date

1935

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3061

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Nature 99.7
Human 99.5
Person 99.5
Outdoors 99.5
Person 99.1
Building 98.4
Housing 97.4
Countryside 96.9
Hut 95.1
Rural 95.1
Shack 94.0
Shelter 92.2
House 90.9
Clothing 86.5
Apparel 86.5
Cabin 82.5
Female 62.1
Woman 55.7
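The provider tags above are flat (label, confidence) pairs. As a minimal illustration of how such a list might be post-processed, here is a hypothetical confidence-threshold filter; the values are copied from the Amazon list above, and the 90% cutoff is an arbitrary assumption, not part of the original record:

```python
# Hypothetical post-processing of machine-generated (label, confidence) tags.
# The values below are copied from the Amazon Rekognition list above.
tags = [
    ("Nature", 99.7), ("Person", 99.5), ("Outdoors", 99.5),
    ("Building", 98.4), ("Housing", 97.4), ("Hut", 95.1),
    ("Clothing", 86.5), ("Cabin", 82.5), ("Female", 62.1), ("Woman", 55.7),
]

def confident(tags, threshold=90.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in tags if score >= threshold]

print(confident(tags))
```

With the assumed 90% cutoff, low-confidence guesses such as "Female" (62.1) and "Woman" (55.7) drop out while scene labels like "Nature" and "Hut" survive.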

Imagga
created on 2021-12-15

roof 32.6
thatch 26.7
protective covering 21.8
roofing 20.7
building 20.2
brick 19.9
structure 18.6
old 16.7
adult 16.2
material 15.9
house 15.9
architecture 15.9
attractive 14.7
covering 14.6
person 14
people 13.9
wall 13.3
black 13.2
window 13.1
equipment 12.9
sexy 12.8
man 11.4
fashion 11.3
human 11.2
building material 11.2
body 11.2
portrait 11
sensuality 10.9
model 10.9
lifestyle 10.8
sky 10.8
dress 10.8
shelter 10.7
pretty 10.5
outdoors 10.4
home 10.4
hair 10.3
happy 9.4
umbrella 9.2
face 9.2
love 8.7
hut 8.4
relaxation 8.4
world 8.3
city 8.3
alone 8.2
one 8.2
lady 8.1
life 8
posing 8
greenhouse 7.9
urban 7.9
tile 7.5
tourism 7.4
smiling 7.2
women 7.1
male 7.1

Google
created on 2021-12-15

Smile 89.6
Black 89.6
Hat 88.7
Flash photography 88.3
Dress 86.6
Sunglasses 85.4
Gesture 85.3
Style 83.9
Wood 83.8
Black-and-white 83.1
Happy 82.9
Headgear 82.1
Adaptation 79.2
People 77.6
Tints and shades 77.3
Monochrome photography 73.9
Vintage clothing 72.2
Landscape 70.2
Event 70
Child 69.8

Microsoft
created on 2021-12-15

building 99
outdoor 97.2
clothing 94.3
person 90.7
smile 87.6
woman 84.7
dress 78.1
human face 77.3

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 95.5%
Sad 72.4%
Calm 12%
Confused 6.4%
Fear 5%
Surprised 2.1%
Angry 1.6%
Disgusted 0.3%
Happy 0.1%

AWS Rekognition

Age 13-25
Gender Female, 94%
Calm 90.7%
Confused 4%
Sad 3.9%
Happy 0.6%
Surprised 0.4%
Angry 0.4%
Disgusted 0.1%
Fear 0.1%

AWS Rekognition

Age 3-11
Gender Female, 77.2%
Calm 62.7%
Sad 12.8%
Surprised 9.8%
Happy 7.4%
Angry 3.7%
Fear 1.6%
Disgusted 1%
Confused 0.9%

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
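Each Rekognition face block above reports a full emotion distribution. A small hypothetical sketch of reducing such a distribution to its dominant label, using the percentages from the first face (Gender Male, 95.5%):

```python
# Hypothetical reduction of a Rekognition-style emotion distribution to its
# top label; percentages copied from the first AWS Rekognition face above.
emotions = {
    "Sad": 72.4, "Calm": 12.0, "Confused": 6.4, "Fear": 5.0,
    "Surprised": 2.1, "Angry": 1.6, "Disgusted": 0.3, "Happy": 0.1,
}

def dominant(emotions):
    """Return the (label, score) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant(emotions))
```

Applied to the three faces in this record, this reduction would yield "Sad" for the first face and "Calm" for the other two.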

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a woman standing in front of a building 86.5%
a woman sitting on a bench in front of a building 68.9%
a woman sitting in front of a building 68.8%