Human Generated Data

Title

Untitled (sharecropper family, Little Rock, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American, 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2823

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Boy 98.9
Child 98.9
Male 98.9
Person 98.9
Person 98.6
Architecture 98.6
Building 98.6
House 98.6
Housing 98.6
Porch 98.6
Chair 98.3
Furniture 98.3
Wood 98.2
Outdoors 95.2
Shelter 95.2
Firearm 94.4
Gun 94.4
Rifle 94.4
Weapon 94.4
Person 91.1
Bench 89.6
Face 84.2
Head 84.2
Countryside 74.2
Hut 74.2
Nature 74.2
Rural 74.2
Clothing 57.7
Hat 57.7
Cabin 57
Carpenter 56.8
Shack 56.7
Deck 56.6
Door 55

Clarifai
created on 2018-05-10

people 100
adult 98.7
group 98.5
group together 98.4
man 96.6
two 95.6
war 94
administration 93.8
one 93.6
vehicle 91.8
three 91
furniture 90.8
child 90.5
military 89.5
actor 89.2
woman 87
canine 84.3
four 82.9
weapon 82.6
leader 82.4

Imagga
created on 2023-10-06

sliding door 57
door 47.6
chair 37.5
movable barrier 34.2
seat 26.6
barrier 22.8
building 19.7
wood 18.3
rocking chair 17.3
window 15.9
furniture 14.9
house 13.4
device 12.9
outdoors 12.7
day 12.5
old 12.5
steel 12.4
bench 12.1
water 12
relaxation 11.7
architecture 11.7
people 11.7
obstruction 11.5
home 11.2
industry 11.1
winter 11.1
man 10.7
outdoor 10.7
metal 10.5
structure 10.4
cold 10.3
tree 10
machine 10
summer 9.6
support 9.5
relax 9.3
snow 9.3
male 9.2
leisure 9.1
park 9.1
adult 9
wooden 8.8
lifestyle 8.7
outside 8.6
construction 8.5
person 8.5
sky 8.3
relaxing 8.2
industrial 8.2
swing 8.2
handcart 8
light 8
work 7.8
black 7.8
glass 7.8
patio 7.7
attractive 7.7
loom 7.5
happy 7.5
one 7.5
vintage 7.4
landscape 7.4
shovel 7.4
equipment 7.3
dirty 7.2
wheeled vehicle 7.2
worker 7.1
women 7.1
trees 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 95.7
black 71.8
old 58.7

Face analysis

AWS Rekognition

Age 6-16
Gender Male, 95.8%
Sad 100%
Surprised 6.3%
Fear 6%
Calm 1.1%
Confused 0.7%
Angry 0.2%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 42-50
Gender Male, 100%
Calm 94.7%
Surprised 6.4%
Fear 6%
Sad 3.3%
Angry 0.8%
Confused 0.3%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 41-49
Gender Male, 88.8%
Sad 99.9%
Calm 9.8%
Surprised 6.7%
Fear 6.6%
Happy 6.5%
Confused 2.5%
Disgusted 0.8%
Angry 0.4%

AWS Rekognition

Age 23-33
Gender Female, 89.5%
Calm 62.8%
Happy 33.9%
Surprised 6.4%
Fear 5.9%
Sad 2.6%
Confused 0.8%
Disgusted 0.4%
Angry 0.3%

Microsoft Cognitive Services

Age 48
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Boy 98.9%
Child 98.9%
Male 98.9%
Person 98.9%
Chair 98.3%
Bench 89.6%

Categories

Imagga

interior objects 100%