Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.935

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Food 100
Meal 100
Cafeteria 100
Restaurant 100
Dish 99.7
Lunch 98.7
People 98.5
Person 98.5
Adult 98.5
Male 98.5
Man 98.5
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Adult 98.2
Male 98.2
Man 98.2
Person 98
Adult 98
Male 98
Man 98
Person 97.6
Adult 97.6
Male 97.6
Man 97.6
Person 97.5
Baby 97.5
Person 96.6
Adult 96.6
Male 96.6
Man 96.6
Face 92.4
Head 92.4
Dinner 80.6
Person 80.2
Baby 80.2
Eating 66.8
Tabletop 57.9
Food Court 57.5
Cutlery 56.5
Kitchen 56.4
Spoon 55.7
Factory 55.2
Manufacturing 55.2

Clarifai
created on 2018-05-11

people 99.9
group 99
group together 98
adult 97.9
man 96.5
war 94.7
room 94.3
furniture 93.8
administration 93.2
child 92.2
several 91.7
military 90.6
uniform 88.7
many 88.1
recreation 87
sit 86.5
education 84.6
five 84
leader 84
four 83.1

Imagga
created on 2023-10-07

man 43
people 36.3
male 34.2
adult 31
person 29.7
meeting 27.3
couple 27
sitting 26.6
home 25.5
smiling 24.6
indoors 23.7
classroom 23.4
men 23.2
office 22.9
table 21.9
together 21
happy 20.7
room 20.4
business 20.1
team 18.8
senior 18.8
computer 18.5
talking 18.1
professional 18
teacher 17.8
businesswoman 17.3
businessman 16.8
women 16.6
40s 16.6
group 16.1
desk 15.5
restaurant 15.4
adults 15.2
color 15
smile 15
work 14.9
teamwork 14.8
education 14.7
student 14.7
businesspeople 14.2
family 14.2
mature 14
child 13.9
executive 13.8
lifestyle 13.7
colleagues 13.6
working 13.3
laptop 13.2
corporate 12.9
cheerful 12.2
day 11.8
discussion 11.7
retired 11.6
friends 11.3
coffee 11.1
emotion 11.1
occupation 11
communication 10.9
drink 10.9
holding 10.7
daytime 10.6
elderly 10.5
love 10.3
book 10.2
happiness 10.2
two 10.2
indoor 10.1
children 10
suit 10
cup 10
discussing 9.8
husband 9.8
job 9.7
class 9.6
30s 9.6
looking 9.6
workplace 9.5
college 9.5
clothes 9.4
clothing 9.3
presentation 9.3
20s 9.2
beverage 9.1
dinner 9
technology 8.9
forties 8.8
days 8.8
dad 8.8
casual clothing 8.8
emotions 8.7
books 8.7
kin 8.6
reading 8.6
wife 8.5
meal 8.5
writing 8.5
grandfather 8.5
learning 8.5
educator 8.5
old 8.4
to 8
boardroom 7.9
boy 7.8
educational 7.8
partners 7.8
corporation 7.7
retirement 7.7
studying 7.7
chair 7.7
drinking 7.7
food 7.6
togetherness 7.6
relationship 7.5
friendship 7.5
portrait 7.1
father 7.1
interior 7.1
patient 7.1
attractive 7

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
indoor 95.5
wall 95.4
group 69.9
meal 32.1
family 15.2

Color Analysis

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 96.2%
Calm 48.6%
Sad 28.7%
Confused 20.2%
Surprised 8.9%
Fear 6.9%
Happy 1.6%
Angry 1.4%
Disgusted 1.1%

AWS Rekognition

Age 36-44
Gender Male, 99.9%
Sad 100%
Surprised 6.3%
Fear 5.9%
Calm 0.9%
Confused 0.7%
Angry 0.2%
Disgusted 0.1%
Happy 0%

AWS Rekognition

Age 13-21
Gender Male, 96.7%
Calm 83.3%
Sad 8.2%
Fear 7.2%
Surprised 6.6%
Happy 2.3%
Angry 0.3%
Confused 0.3%
Disgusted 0.2%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 95.6%
Surprised 6.4%
Fear 5.9%
Confused 3.1%
Sad 2.2%
Angry 0.4%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 20-28
Gender Male, 95.7%
Calm 78.4%
Sad 27.9%
Surprised 6.3%
Fear 6%
Disgusted 0.5%
Angry 0.2%
Confused 0.2%
Happy 0.2%

Microsoft Cognitive Services

Age 69
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.5%
Adult 98.5%
Male 98.5%
Man 98.5%
Baby 97.5%

Categories