Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.986

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Garden 100
Nature 100
Outdoors 100
Adult 99.5
Male 99.5
Man 99.5
Person 99.5
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Person 99.1
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99
Male 99
Man 99
Person 99
Animal 98.9
Bird 98.9
Chicken 98.9
Fowl 98.9
Poultry 98.9
Gardening 97.6
Face 87.9
Head 87.9
Gardener 58
Backyard 57.5
Yard 57.5
Canine 57.1
Mammal 57.1
Clothing 56.7
Coat 56.7
Pet 55.8
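
The Amazon tags above are automated labels with confidence scores. As a rough, hedged sketch only (not the museum's actual tagging pipeline), labels of this kind can be produced with the AWS Rekognition DetectLabels API via boto3; the file name in the usage note is a hypothetical placeholder.

# Sketch: generate image labels with confidences using AWS Rekognition.
# Assumes boto3 is installed and AWS credentials are configured.
import boto3

def detect_image_labels(image_path, min_confidence=55.0, max_labels=50):
    """Return (label name, confidence) pairs for a local image file."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=max_labels,
            MinConfidence=min_confidence,
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

# Example usage (hypothetical file name):
# for name, conf in detect_image_labels("shahn_wheat_harvest.jpg"):
#     print(f"{name} {conf:.1f}")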

Clarifai
created on 2018-05-11

people 100
group 99
child 98.2
adult 98
group together 97.7
several 95.7
man 95.5
woman 94.4
administration 93
boy 92.4
war 90.6
many 89
three 88.6
four 87.3
five 86.4
recreation 86.3
wear 86.1
military 84.4
outfit 84.1
vehicle 83.3

Imagga
created on 2023-10-07

man 29.5
male 27.3
people 25.1
person 21.6
men 18.9
adult 18.5
couple 18.3
kin 16.7
old 15.3
barbershop 14.6
happy 14.4
family 14.2
love 14.2
happiness 14.1
senior 13.1
home 12.8
portrait 12.3
world 12.1
together 11.4
shop 11.3
human 11.2
groom 10.5
bride 10.5
father 10.3
day 10.2
city 10
face 9.9
child 9.8
elderly 9.6
husband 9.5
room 9.5
women 9.5
nurse 9.3
two 9.3
wedding 9.2
holding 9.1
care 9
dress 9
mercantile establishment 9
outdoors 9
hospital 8.9
uniform 8.8
hair 8.7
marriage 8.5
park 8.2
clothing 8.2
girls 8.2
aged 8.1
religion 8.1
patient 8
mother 8
smiling 8
medical 7.9
sitting 7.7
retirement 7.7
life 7.6
grandfather 7.6
talking 7.6
hand 7.6
wife 7.6
traditional 7.5
lady 7.3
business 7.3
color 7.2
lifestyle 7.2
looking 7.2
to 7.1
gun 7.1
mask 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.1
outdoor 97.4
posing 83
people 81.5
black 81.4
standing 80.4
group 67.6
white 64.8
old 62.2

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 99.8%
Disgusted 90.1%
Surprised 6.7%
Fear 6%
Confused 4.5%
Sad 2.5%
Calm 1.9%
Angry 1.1%
Happy 0.1%

AWS Rekognition

Age 6-12
Gender Male, 73%
Calm 80.5%
Sad 18.1%
Surprised 6.5%
Fear 6.1%
Confused 1%
Happy 0.6%
Angry 0.4%
Disgusted 0.4%

AWS Rekognition

Age 19-27
Gender Male, 100%
Calm 67.9%
Confused 20.1%
Surprised 6.9%
Fear 6%
Happy 5.9%
Sad 3.2%
Angry 0.7%
Disgusted 0.7%
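
The per-face age ranges, gender estimates, and emotion confidences above resemble the output of Rekognition's face analysis. As a hedged sketch under that assumption (not the script that generated this record), DetectFaces with Attributes=["ALL"] returns this information for each detected face; the file name is a placeholder.

# Sketch: per-face age/gender/emotion estimates via AWS Rekognition DetectFaces.
import boto3

def analyze_faces(image_path):
    """Return age range, gender, and sorted emotion confidences per detected face."""
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])
    results = []
    for face in response["FaceDetails"]:
        results.append({
            "age_range": (face["AgeRange"]["Low"], face["AgeRange"]["High"]),
            "gender": (face["Gender"]["Value"], face["Gender"]["Confidence"]),
            # Emotions come back with confidences; list the strongest first.
            "emotions": sorted(
                ((e["Type"], e["Confidence"]) for e in face["Emotions"]),
                key=lambda pair: pair[1],
                reverse=True,
            ),
        })
    return results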

Microsoft Cognitive Services

Age 48
Gender Female

Microsoft Cognitive Services

Age 46
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.5%
Male 99.5%
Man 99.5%
Person 99.5%
Chicken 98.9%