Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.967

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 99.7
Food 99.7
Meal 99.7
Adult 99.3
Male 99.3
Man 99.3
Person 99.3
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Cafeteria 98.4
Dish 95.6
Adult 94.8
Person 94.8
Female 94.8
Woman 94.8
Person 94.7
Face 93
Head 93
Pottery 87.3
Person 74.9
People 69.7
Art 60.6
Painting 60.6
Dinner 57.4
Food Court 56.8
Cutlery 56.7
Tabletop 56.7
Home Decor 56.1
Linen 56.1
Spoon 55.8
Porcelain 55.6

Clarifai
created on 2018-05-11

people 99.9
group 99.4
group together 98.6
adult 97.4
many 96.1
man 95.8
war 95.5
military 94
several 90.9
child 89.8
woman 89.8
furniture 89.7
soldier 89.5
room 88.7
five 88.6
four 87.4
administration 85.6
sit 84.5
recreation 82.1
uniform 79.7

Imagga
created on 2023-10-06

uniform 64.7
military uniform 57.1
clothing 42.6
man 38.3
male 31.2
person 29.6
covering 25
engineer 24.4
consumer goods 23.9
people 19
adult 18.3
work 18
helmet 17.6
worker 16.9
men 16.3
equipment 16.1
home 15.9
gun 15.8
hat 15.8
happy 15.7
job 15
soldier 14.7
war 14.5
military 14.5
working 14.1
occupation 13.7
industry 13.6
smile 13.5
weapon 12
safety 12
protection 11.8
commodity 11.2
scholar 11.2
smiling 10.8
to 10.6
repair 10.5
together 10.5
couple 10.4
outdoors 10.4
portrait 10.3
camouflage 10
industrial 10
leisure 10
professional 9.8
old 9.7
mask 9.7
indoors 9.7
guy 9.4
construction 9.4
lifestyle 9.4
two 9.3
vehicle 9.1
hand 9.1
builder 9.1
danger 9.1
fun 9
activity 9
intellectual 8.9
handsome 8.9
hardhat 8.8
army 8.8
collar 8.6
site 8.4
attractive 8.4
paint 8.1
active 8.1
looking 8
business 7.9
boy 7.8
contractor 7.8
sitting 7.7
house 7.5
room 7.5
sport 7.4
recreation 7.2
game 7.1
family 7.1
face 7.1
interior 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
wall 95.9
indoor 86.6
people 79.8
group 74.1
meal 44.9
dinner 36.3
family 23.6
older 20.6

Color Analysis

Face analysis

AWS Rekognition

Age 23-31
Gender Female, 91.4%
Happy 79.4%
Calm 10.9%
Fear 7.1%
Surprised 6.9%
Confused 3.3%
Sad 2.5%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 23-31
Gender Male, 99.5%
Calm 66.2%
Sad 12%
Confused 10.2%
Fear 7.3%
Surprised 7.1%
Disgusted 2.3%
Happy 2.1%
Angry 1.3%

AWS Rekognition

Age 45-51
Gender Male, 100%
Confused 49.9%
Calm 41.3%
Surprised 6.8%
Fear 6.4%
Angry 3%
Sad 2.7%
Disgusted 1.2%
Happy 0.7%

Microsoft Cognitive Services

Age 65
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.3%
Male 99.3%
Man 99.3%
Person 99.3%
Female 94.8%
Woman 94.8%
