Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3045

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (label and confidence, %)

Amazon
created on 2023-10-07

Dining Table 100
Furniture 100
Table 100
Architecture 100
Building 100
Dining Room 100
Indoors 100
Room 100
Restaurant 99.8
Food 99.6
Meal 99.6
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Cafeteria 98.9
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 95.3
Male 95.3
Man 95.3
Person 95.3
Adult 92.7
Male 92.7
Man 92.7
Person 92.7
Plate 91.4
Face 89.6
Head 89.6
Person 85.1
Baby 85.1
Dish 84.5
Person 83.1
Baby 83.1
Blade 80.7
Knife 80.7
Weapon 80.7
Dinner 76.4
Beverage 71.2
Coffee 71.2
Coffee Cup 71.2
Cutlery 66.9
Coffee Cup 61.6
Glass 57.7
Tabletop 57
Cup 57
Food Court 56.8
Home Decor 56.6
Linen 56.6
Alcohol 55.4
Spoon 55.4
Diner 55.3
Lunch 55.2
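
The label/confidence pairs above have the shape returned by the AWS Rekognition DetectLabels API. The following is a minimal, hypothetical boto3 sketch of how such tags could be produced; the local file name and the MinConfidence threshold are illustrative assumptions, not part of this record.

# Hypothetical sketch: image labels via AWS Rekognition DetectLabels.
# The file name and confidence threshold are illustrative assumptions.
import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("shahn_wheat_harvest.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=55.0,  # roughly the lowest confidence listed above
)

# Print "Label Confidence" pairs, similar to the tag list in this record
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")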

Clarifai
created on 2018-05-10

people 99.9
group 98.4
adult 97.3
group together 96.6
many 96.4
man 93.9
military 93.7
woman 93.3
administration 89.9
furniture 89.8
war 89.5
several 88.7
room 85.2
leader 82
sit 78.9
child 77.2
wear 70.9
recreation 69.9
soldier 68.8
vehicle 67.7

Imagga
created on 2023-10-07

dinner 59.2
meal 47.3
restaurant 46.8
table 44.7
food 39.7
plate 32.3
lunch 30.2
banquet 29.9
drink 26.7
dish 22
glass 20.7
wine 20
dining 20
celebration 19.9
setting 19.3
catering 18.6
eating 18.5
party 18.1
breakfast 17.5
nutriment 16.9
indoors 16.7
sitting 15.5
eat 15.1
cuisine 15.1
interior 15
adult 14.9
cup 14.6
board 14.3
luxury 13.7
service 13
stall 12.9
reception 12.7
fork 12.5
hotel 12.4
together 12.3
couple 12.2
men 12
women 11.9
knife 11.8
dine 11.8
people 11.7
meat 11.7
kitchen 11.6
serving 11.6
happy 11.3
smiling 10.9
napkin 10.8
holding 10.7
spoon 10.7
cooking 10.5
gourmet 10.2
wedding 10.1
person 10.1
cook 10.1
buffet 9.8
plates 9.8
serve 9.8
decor 9.7
china 9.7
arrangement 9.6
home 9.6
drinking 9.6
beverage 9.6
formal 9.6
bread 9.5
chair 9.5
man 9.5
decoration 9.4
supper 9.4
lifestyle 9.4
coffee 9.3
glasses 9.3
room 9.2
business 9.1
tablecloth 8.8
dishes 8.8
cutlery 8.8
30s 8.7
enjoying 8.5
adults 8.5
male 8.5
senior 8.4
delicious 8.3
couples 7.8
bowl 7.8
tabletop 7.8
tea 7.7
fancy 7.7
fine 7.6
healthy 7.6
friends 7.5
friendship 7.5
traditional 7.5
tableware 7.5
place 7.5
event 7.4
alcohol 7.4
inside 7.4
20s 7.3
indoor 7.3
color 7.2
container 7.2
smile 7.1
day 7.1

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 98.7
indoor 89.5
table 28.6
meal 18.2
dining table 14
cluttered 11.6
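
These tag/confidence pairs match the output shape of the Microsoft (Azure) Computer Vision tagging endpoint. A minimal sketch using the requests library follows; the endpoint, API version, subscription key, and image URL are placeholders, and the service version actually used in 2018 may differ.

# Hypothetical sketch (not the museum's pipeline): image tags from the Azure
# Computer Vision "tag" endpoint. Endpoint, key, and image URL are placeholders.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<subscription-key>"                                        # placeholder

resp = requests.post(
    f"{endpoint}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"url": "https://example.org/shahn_wheat_harvest.jpg"},  # placeholder image URL
)
resp.raise_for_status()

# Confidences are returned in [0, 1]; scale to percentages like the list above
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")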

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Male, 88.5%
Calm 94.2%
Surprised 6.7%
Fear 6.1%
Sad 2.3%
Confused 1.8%
Disgusted 0.8%
Happy 0.6%
Angry 0.6%

AWS Rekognition

Age 18-24
Gender Male, 99.7%
Calm 97.9%
Surprised 6.3%
Fear 5.9%
Sad 2.8%
Disgusted 0.1%
Angry 0.1%
Confused 0%
Happy 0%

AWS Rekognition

Age 25-35
Gender Male, 99.5%
Sad 100%
Calm 15.5%
Surprised 6.3%
Fear 5.9%
Confused 2.1%
Disgusted 0.3%
Angry 0.2%
Happy 0%
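
The three blocks above follow the structure of an AWS Rekognition DetectFaces response: an age range, a gender estimate with confidence, and independently scored emotions (which is why they need not sum to 100%). A minimal, hypothetical boto3 sketch follows; the file name is an assumption.

# Hypothetical sketch: per-face age range, gender, and emotion estimates via
# AWS Rekognition DetectFaces. The file name is an illustrative assumption.
import boto3

client = boto3.client("rekognition")

with open("shahn_wheat_harvest.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, not just bounding boxes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions are scored independently, so the values need not sum to 100%
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")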

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
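
Unlike Rekognition, the Google Vision rows report likelihood buckets (Very unlikely through Very likely) rather than percentages, matching the face_detection response of the Cloud Vision API. A minimal sketch with the google-cloud-vision client follows; the file name is a placeholder.

# Hypothetical sketch: face likelihood buckets via the Google Cloud Vision API.
# The file name is an illustrative assumption.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application default credentials

with open("shahn_wheat_harvest.jpg", "rb") as f:  # hypothetical local copy
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

likelihood = vision.Likelihood  # enum: VERY_UNLIKELY ... VERY_LIKELY
for face in response.face_annotations:
    print("Surprise", likelihood(face.surprise_likelihood).name)
    print("Anger", likelihood(face.anger_likelihood).name)
    print("Sorrow", likelihood(face.sorrow_likelihood).name)
    print("Joy", likelihood(face.joy_likelihood).name)
    print("Headwear", likelihood(face.headwear_likelihood).name)
    print("Blurred", likelihood(face.blurred_likelihood).name)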

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Plate 91.4%
Baby 85.1%
Knife 80.7%
Coffee Cup 71.2%

Categories

Imagga

paintings art 77.7%
food drinks 20.2%
interior objects 1.6%