Human Generated Data

Title

Untitled (Ozarks, Arkansas)

Date

October 1935

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3041

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Face 100
Head 100
Photography 100
Portrait 100
Clothing 100
Coat 100
Jacket 100
Person 99.3
Boy 99.3
Child 99.3
Male 99.3
Person 98.8
Male 98.8
Adult 98.8
Man 98.8
Person 96.4
Adult 96.4
Female 96.4
Woman 96.4
Person 91.5
Baby 91.5
Body Part 89.7
Finger 89.7
Hand 89.7
Cap 89.2
Machine 86.9
Wheel 86.9
Transportation 79.6
Vehicle 79.6
People 77.9
Hat 57.7
Baseball Cap 57.6
Outdoors 57.5
Spoke 56.4
Driving 56.2
Worker 55.6

Clarifai
created on 2018-05-10

people 99.8
group together 99.1
group 97.6
adult 97.5
man 96.1
portrait 96
vehicle 95.5
child 95.1
military 94
wear 93.9
four 93.1
several 92.2
three 92.1
war 91.3
woman 91.3
veil 90
recreation 89.7
two 89.4
five 88.4
outfit 87.4

Imagga
created on 2023-10-05

bench 32.5
people 30.1
happy 30.1
park 29.1
park bench 28.8
outdoors 28.3
family 26.7
child 24.5
together 24.5
kin 24.4
man 24.2
outdoor 22.2
smile 22.1
portrait 22
outside 21.4
seat 21
fun 20.9
male 20.6
person 20.5
smiling 18.8
youth 18.7
love 18.1
grass 17.4
boy 17.4
mother 17.1
kid 16.8
summer 16.7
couple 16.5
happiness 16.4
lifestyle 15.2
adult 14.3
joy 14.2
parent 14.1
sibling 13.8
women 13.4
childhood 13.4
autumn 13.2
stretcher 12.6
statue 12.6
leisure 12.4
senior 12.2
sitting 12
son 11.7
father 11.5
furniture 11.5
daughter 11.4
laughing 11.3
relationship 11.2
old 11.1
conveyance 10.7
togetherness 10.4
men 10.3
day 10.2
season 10.1
litter 10.1
garden 10.1
cute 10
resort area 10
face 9.9
meadow 9.9
cheerful 9.7
little 9.7
grandfather 9.6
elderly 9.6
friends 9.4
winter 9.4
sky 8.9
group 8.9
cold 8.6
casual 8.5
relax 8.4
attractive 8.4
mature 8.4
teen 8.3
countryside 8.2
recreation 8.1
sculpture 8
hair 7.9
holiday 7.9
seasonal 7.9
play 7.7
area 7.7
pretty 7.7
enjoying 7.6
enjoy 7.5
friendship 7.5
relaxing 7.3
fall 7.2
active 7.2
religion 7.2

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

outdoor 99.9
person 99.6
old 52.2

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.8%
Calm 99.4%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Angry 0.2%
Confused 0.1%
Happy 0%
Disgusted 0%

AWS Rekognition

Age 2-8
Gender Male, 100%
Sad 84.1%
Confused 30.7%
Fear 13%
Calm 9.4%
Surprised 7.4%
Angry 3.3%
Disgusted 2.5%
Happy 0.5%

AWS Rekognition

Age 28-38
Gender Male, 77.1%
Angry 66.7%
Confused 30.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Calm 1.9%
Disgusted 0.1%
Happy 0.1%

AWS Rekognition

Age 0-4
Gender Male, 99.8%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Confused 0%
Angry 0%
Disgusted 0%
Happy 0%

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 60
Gender Male

Microsoft Cognitive Services

Age 5
Gender Female

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%
Boy 99.3%
Child 99.3%
Male 99.3%
Adult 98.8%
Man 98.8%
Female 96.4%
Woman 96.4%
Baby 91.5%
Wheel 86.9%
Hat 57.7%

Categories

Imagga

paintings art 96.6%
people portraits 3.1%