Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.946

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Food 100
Meal 100
Dish 100
Lunch 100
Dining Table 99.8
Furniture 99.8
Table 99.8
Indoors 99.1
Restaurant 99.1
Cafeteria 99
Adult 99
Male 99
Man 99
Person 99
Cutlery 98.9
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Bowl 98
Adult 96.3
Male 96.3
Man 96.3
Person 96.3
Architecture 92.6
Building 92.6
Dining Room 92.6
Room 92.6
Spoon 91.8
Face 91.3
Head 91.3
Person 85.5
Plate 61.4
Beverage 61.2
Coffee 61.2
Coffee Cup 61.2
Stew 57.8
Dinner 57.6
Eating 56.5
Food Court 56.1
Fork 56

Clarifai
created on 2018-05-11

people 99.8
group 98.1
adult 97.6
group together 95.8
man 95.5
many 92.7
woman 92.5
food 92.1
several 91.9
restaurant 90.4
furniture 86.3
military 85.6
wear 82
table 81.5
two 80
meal 78.7
place setting 78.5
tableware 77.3
administration 77.3
four 75.1

Imagga
created on 2023-10-06

banquet 81.1
dinner 77.1
meal 67.4
food 52.9
restaurant 38.2
nutriment 37.1
table 34.8
plate 32.4
lunch 27.3
eating 26.1
board 25.4
drink 25.1
couple 20.9
wine 19.6
dish 19.3
breakfast 18.7
cuisine 18.6
cup 18.3
sitting 18.1
indoors 16.7
glass 16.4
kitchen 15.2
smiling 15.2
together 14.9
adult 14.9
man 14.8
spoon 14.3
dining 14.3
bowl 14.1
delicious 14
coffee 13.9
happy 13.8
celebration 13.6
meat 13.5
fork 13.4
people 13.4
lifestyle 13
beverage 12.9
men 12.9
home 12.8
traditional 12.5
holding 12.4
friends 12.2
gourmet 11.9
serving 11.6
enjoying 11.4
women 11.1
healthy 10.7
hotel 10.5
enjoyment 10.3
male 9.9
knife 9.9
dishes 9.8
setting 9.6
30s 9.6
tea 9.5
happiness 9.4
friendship 9.4
cook 9.2
person 9.1
dessert 9.1
interior 8.9
plates 8.8
couples 8.8
catering 8.8
alcohol 8.8
party 8.6
luxury 8.6
smile 8.6
cooked 8.5
two 8.5
eat 8.4
service 8.3
20s 8.3
salad 8.2
room 8.2
vegetables 8.2
diet 8.2
group 8.1
family 8
dining room 7.9
beverages 7.9
drinks 7.8
having 7.8
waist 7.8
culture 7.7
prepared 7.6
adults 7.6
cake 7.5
bread 7.4
cheerful 7.3
indoor 7.3
color 7.2
sweet 7.1

Microsoft
created on 2018-05-11

person 100
table 98.3
dish 47.7
meal 32

Color Analysis

Face analysis

AWS Rekognition

Age 43-51
Gender Male, 99.7%
Calm 78.2%
Sad 25.6%
Surprised 6.4%
Fear 6%
Confused 1.3%
Angry 0.3%
Happy 0.3%
Disgusted 0.2%

AWS Rekognition

Age 53-61
Gender Female, 52.3%
Angry 78.6%
Calm 11.6%
Surprised 6.4%
Fear 6%
Disgusted 4.6%
Sad 3.3%
Confused 0.8%
Happy 0.6%

Microsoft Cognitive Services

Age 44
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Plate 61.4%
Coffee Cup 61.2%

Categories

Imagga

food drinks 98.5%