Human Generated Data

Title

Dinner during wheat harvest time, central Ohio

Date

1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3028

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 99.9
Person 99.9
Person 99.8
Person 99.6
Person 99.6
Restaurant 98.5
Meal 86.5
Food 86.5
Cafeteria 81.1
Person 79.8
Dish 73.5
Bowl 73.2
Food Court 66
Pottery 61.5
Room 56.9
Indoors 56.9
People 55.9

Clarifai
created on 2023-10-15

people 100
child 99.8
adult 99.4
group 99.2
group together 99
furniture 98.8
man 97.7
boy 97.6
room 97
son 96.2
monochrome 95.9
offspring 95.5
woman 94.8
sit 92.6
portrait 92.1
family 91.9
war 91.7
four 89.3
three 89.2
indoors 88.4

Imagga
created on 2021-12-15

man 42.4
male 36.2
meeting 34.9
people 34.6
business 31.6
person 31.3
office 29.6
adult 29.3
team 27.8
businessman 27.4
table 26.2
businesswoman 25.5
couple 24.4
working 23.9
home 23.1
indoors 22.9
group 22.6
happy 21.9
businesspeople 21.8
work 21.2
together 21
senior 20.6
men 20.6
colleagues 20.4
smiling 20.3
professional 18
sitting 18
30s 17.3
corporate 17.2
laptop 16.9
executive 16.8
teamwork 16.7
40s 16.6
discussion 16.6
desk 16.4
talking 16.2
adults 15.2
room 15.1
classroom 14.7
worker 14.5
smile 14.3
mature 14
discussing 13.8
computer 13.7
job 13.3
communication 12.6
workplace 12.4
scholar 12.3
20s 11.9
child 11.7
busy 11.6
lifestyle 11.6
family 11.6
clothing 11.3
manager 11.2
presentation 11.2
student 11.2
portrait 11
day 11
conference 10.8
retired 10.7
cheerful 10.6
females 10.4
friends 10.3
women 10.3
suit 10
color 10
intellectual 9.9
kin 9.9
partners 9.7
success 9.7
daytime 9.6
elderly 9.6
education 9.5
happiness 9.4
casual 9.3
grandfather 9.3
company 9.3
coffee 9.3
generator 9.2
successful 9.2
beverage 9
four people 8.9
teacher 8.8
businessmen 8.8
husband 8.7
partner 8.7
four 8.6
wife 8.5
two 8.5
writing 8.5
learning 8.5
food 8.5
restaurant 8.4
drink 8.4
document 8.4
glasses 8.3
camera 8.3
planner 8.3
book 8.2
children 8.2
boardroom 7.9
elementary age 7.9
dad 7.9
business people 7.9
coworkers 7.9
appointment 7.9
forties 7.8
days 7.8
middle aged 7.8
conversation 7.8
emotions 7.8
secretary 7.7
horizontal 7.5
dinner 7.5
clothes 7.5
emotion 7.4
occupation 7.3
friendly 7.3
looking 7.2
blond 7.2
love 7.1
paper 7.1

Google
created on 2021-12-15

White 92.2
Tableware 89.5
Table 88.6
Sharing 82.5
Food 79.8
Cooking 78.2
Art 77.3
Snapshot 74.3
Plate 74.2
Dishware 73.7
Dish 72.7
Monochrome photography 72.6
Monochrome 70.8
Room 69.3
Cuisine 68.9
Event 68.8
Serveware 68.6
Vintage clothing 67.3
Comfort food 65.7
Chair 65.6

Microsoft
created on 2021-12-15

person 99.7
text 99.2
clothing 98.6
indoor 97.4
human face 94.6
man 92.8
food 88.9
group 69.2
meal 22.7

Face analysis

AWS Rekognition

Age 50-68
Gender Male, 94.6%
Sad 92%
Calm 7.5%
Confused 0.3%
Angry 0.2%
Happy 0%
Surprised 0%
Fear 0%
Disgusted 0%

AWS Rekognition

Age 32-48
Gender Male, 98.5%
Calm 92.8%
Sad 3.6%
Angry 2.6%
Happy 0.3%
Fear 0.2%
Confused 0.2%
Disgusted 0.1%
Surprised 0.1%

AWS Rekognition

Age 38-56
Gender Male, 97.9%
Calm 55.8%
Happy 24%
Angry 12%
Confused 2.6%
Sad 2.4%
Surprised 1.2%
Disgusted 1.2%
Fear 0.8%

AWS Rekognition

Age 26-42
Gender Female, 56.9%
Calm 52%
Sad 43.9%
Happy 2.4%
Confused 0.7%
Fear 0.3%
Surprised 0.3%
Disgusted 0.1%
Angry 0.1%

Microsoft Cognitive Services

Age 57
Gender Male

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.9%

Categories

Imagga

people portraits 51.6%
paintings art 43.8%
food drinks 2.6%