Human Generated Data

Title

Political forum before dinner during wheat harvest, central Ohio

Date

1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2974

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Person 99.4
Human 99.4
Person 99.3
Person 99.2
Person 98.9
Person 98.6
Person 98.1
Person 98
Person 97.4
Person 95.3
Building 93.7
Countryside 93.7
Shelter 93.7
Nature 93.7
Outdoors 93.7
Rural 93.7
Meal 88.2
Food 88.2
Person 87.2
Person 86.9
People 75.8
Military 73.5
Military Uniform 72.6
Person 70.5
Clothing 69.1
Apparel 69.1
Hut 66.8
Camping 66.7
Armored 66.6
Army 66.6
Leisure Activities 63.6
Picnic 60.5
Vacation 60.5
Housing 60.3
Shack 60
Bunker 56.5
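The tag lists above pair a label with a confidence score on each line, and labels such as "Military Uniform" can contain spaces. A minimal sketch of how such lines might be parsed into structured pairs (the `parse_tags` helper is hypothetical, not part of any provider's API):

```python
# Hypothetical helper for parsing "label confidence" lines like the
# machine-generated tags above (e.g. "Person 99.4", "Military Uniform 72.6").
# The label may contain spaces; the confidence is the trailing number.

def parse_tags(lines):
    """Return a list of (label, confidence) tuples, skipping blank lines."""
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        # rpartition splits on the LAST space, so multi-word labels survive
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

sample = [
    "Person 99.4",
    "Military Uniform 72.6",
    "Bunker 56.5",
]
print(parse_tags(sample))
```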

Imagga
created on 2021-12-15

cemetery 26.1
child 20.3
park 19.8
kin 19
tree 18.8
grass 18.2
outdoors 18.2
forest 16.5
gravestone 16.3
old 16
stone 16
autumn 15.8
memorial 15.6
people 15.6
fall 15.4
trees 15.1
man 14.1
outdoor 13.8
outside 12.8
landscape 11.9
male 11.7
history 11.6
person 11.1
structure 10.5
scene 10.4
field 10
culture 9.4
building 9.3
travel 9.2
adult 9.1
summer 9
religion 9
rural 8.8
mother 8.7
yellow 8.6
architecture 8.6
pretty 8.4
sky 8.3
environment 8.2
father 8.2
smiling 8
leaves 7.9
spring 7.8
boy 7.8
path 7.6
horizontal 7.5
wood 7.5
famous 7.4
garden 7.3
countryside 7.3
danger 7.3
sun 7.2
morning 7.2
color 7.2
holiday 7.2
portrait 7.1
family 7.1
day 7.1
colors 7.1
country 7
agriculture 7

Google
created on 2021-12-15

Plant 89.5
Tree 85.7
Style 83.8
Black-and-white 81.9
Adaptation 79.3
Tints and shades 77.2
Motor vehicle 76.8
Monochrome photography 75.5
Monochrome 73.2
Vintage clothing 72.6
Art 69.7
Event 67.4
Room 67.4
Sitting 66.4
Grass 65.1
Classic 65
History 64.9
Suit 63.6
Photo caption 61.4
Yard 51.7

Microsoft
created on 2021-12-15

outdoor 89.6
person 88.2
clothing 79.1
tree 77.6
man 75.8
black and white 61.7
footwear 56.1
posing 44.7
crowd 1.3

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Male, 96.7%
Calm 84.1%
Happy 4.8%
Sad 3.9%
Disgusted 2.6%
Angry 2.1%
Confused 1.4%
Surprised 0.8%
Fear 0.3%

AWS Rekognition

Age 23-35
Gender Female, 74.8%
Calm 86.7%
Happy 8.4%
Angry 1.7%
Sad 1.5%
Surprised 0.5%
Disgusted 0.5%
Fear 0.5%
Confused 0.2%

AWS Rekognition

Age 25-39
Gender Male, 68.5%
Happy 93.5%
Calm 3.1%
Sad 2.3%
Fear 0.5%
Angry 0.3%
Surprised 0.2%
Confused 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Male, 91.5%
Calm 83.8%
Happy 9.7%
Sad 1.9%
Angry 1.8%
Confused 1.5%
Disgusted 0.5%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 22-34
Gender Male, 89.4%
Happy 68.5%
Calm 20.4%
Sad 4.1%
Angry 2.6%
Surprised 1.7%
Disgusted 1.1%
Fear 0.9%
Confused 0.7%

AWS Rekognition

Age 10-20
Gender Male, 50.2%
Happy 41.1%
Calm 35.6%
Sad 10.3%
Surprised 3.8%
Fear 3.2%
Angry 2.8%
Confused 2.5%
Disgusted 0.6%

AWS Rekognition

Age 38-56
Gender Female, 80.5%
Calm 54.4%
Sad 15.6%
Confused 8.1%
Fear 6.6%
Surprised 5.9%
Happy 5.6%
Angry 2.8%
Disgusted 1%

AWS Rekognition

Age 47-65
Gender Male, 82.7%
Calm 41.7%
Confused 19.1%
Sad 14.7%
Angry 13%
Disgusted 6.7%
Surprised 2.4%
Fear 1.4%
Happy 1%

AWS Rekognition

Age 53-71
Gender Male, 76.4%
Calm 97.2%
Happy 2.1%
Disgusted 0.3%
Sad 0.2%
Angry 0.1%
Surprised 0.1%
Confused 0%
Fear 0%
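Each AWS Rekognition face entry above lists every emotion with a confidence, and the face's dominant emotion is simply the highest-scoring one. A sketch of that summarization step (the `dominant_emotion` helper and the dict-based representation are assumptions for illustration):

```python
# Hypothetical summary step for the per-face emotion scores above: given a
# mapping of emotion -> confidence, the dominant emotion is the entry with
# the highest confidence.

def dominant_emotion(emotions):
    """Return the (emotion, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Scores from the last AWS Rekognition face entry above
face = {"Calm": 97.2, "Happy": 2.1, "Disgusted": 0.3, "Sad": 0.2}
print(dominant_emotion(face))  # ('Calm', 97.2)
```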

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
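Unlike the percentage scores from the other providers, Google Vision reports each face attribute as a likelihood bucket. The buckets form an ordinal scale (following the Vision API's Likelihood enum), so faces can be filtered by threshold. A minimal sketch, with the `at_least` helper being a hypothetical name:

```python
# Hypothetical ordinal mapping for the Google Vision likelihood buckets above,
# allowing threshold filters such as "Joy at least Likely".

LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

def at_least(bucket, threshold):
    """True if `bucket` is at or above `threshold` on the ordinal scale."""
    return LIKELIHOOD[bucket] >= LIKELIHOOD[threshold]

print(at_least("Likely", "Possible"))       # True
print(at_least("Very unlikely", "Likely"))  # False
```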

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a group of people posing for a photo 82.7%
a group of people posing for a picture 82.6%
a group of people posing for the camera 82.5%