Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.945

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-06

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Food 100
Meal 100
Adult 99
Male 99
Man 99
Person 99
Restaurant 98.7
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Cafeteria 97.8
Adult 97.5
Male 97.5
Man 97.5
Person 97.5
Dish 97.5
Beverage 90.1
Coffee 90.1
Coffee Cup 90.1
Face 87.9
Head 87.9
Lunch 80.5
People 67.7
Dinner 66.6
Person 59.4
Bowl 57.8
Eating 56.8
Cutlery 56.7
Spoon 56.7
Food Court 56.4
Kitchen 55.5
Stew 55.4
Tabletop 55
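
The Amazon list above repeats labels such as Adult, Male, Man, and Person at several confidence values, one per detected instance. A minimal sketch (not the museum's actual pipeline; the sample data is a hand-copied excerpt from the list above) of collapsing such a list to one best score per label and dropping low-confidence tags:

```python
# Hand-copied excerpt from the Amazon tag list above; repeated labels
# correspond to multiple detected instances of the same class.
labels = [
    ("Adult", 99.0), ("Male", 99.0), ("Person", 99.0),
    ("Adult", 98.5), ("Male", 98.5), ("Person", 98.5),
    ("Restaurant", 98.7), ("Coffee Cup", 90.1),
    ("Lunch", 80.5), ("People", 67.7), ("Stew", 55.4),
]

def dedupe_and_filter(tags, min_confidence=60.0):
    """Keep the highest score per label, drop labels below the threshold."""
    best = {}
    for name, score in tags:
        best[name] = max(best.get(name, 0.0), score)
    return sorted(
        ((n, s) for n, s in best.items() if s >= min_confidence),
        key=lambda pair: pair[1],
        reverse=True,
    )

print(dedupe_and_filter(labels))
```

With the default threshold of 60, borderline tags such as "Stew 55.4" are dropped and each repeated label survives once at its best score.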

Clarifai
created on 2018-05-11

people 100
group 99.6
group together 99.5
several 98.6
four 97.8
adult 97.2
man 96.9
sit 96.4
many 95.8
military 95.7
five 95
furniture 94.1
woman 94.1
war 94
administration 92.1
room 89.2
uniform 87.9
wear 85.5
recreation 85.2
three 85.1

Imagga
created on 2023-10-06

man 36.3
people 33.5
adult 32.1
table 31.3
male 30.6
person 26.8
sitting 24.9
business 23.1
businessman 22.9
office 21.3
indoors 21.1
couple 20.9
home 20.7
meeting 19.8
men 19.7
team 18.8
happy 18.2
professional 18
women 17.4
smiling 17.4
group 16.9
working 16.8
together 16.6
work 16.6
businesswoman 16.4
smile 15.7
desk 15.3
businesspeople 15.2
job 15
mature 14.9
30s 14.4
looking 14.4
executive 14.2
adults 14.2
friends 14.1
senior 14.1
discussion 13.6
child 13.5
teamwork 13
corporate 12.9
restaurant 12.9
suit 12.8
classroom 12.4
talking 12.4
student 12.2
laptop 12.2
clothing 12.1
indoor 11.9
drink 11.7
lifestyle 11.6
beverage 11
holding 10.7
40s 10.7
family 10.7
education 10.4
worker 10.3
manager 10.2
glasses 10.2
two 10.2
teacher 10.1
children 10
color 10
students 9.7
colleagues 9.7
room 9.7
success 9.7
workplace 9.5
paper 9.4
learning 9.4
casual 9.3
emotion 9.2
pen 9.2
20s 9.2
food 9.1
portrait 9.1
cheerful 8.9
kin 8.8
boy 8.7
daytime 8.7
love 8.7
studying 8.6
day 8.6
glass 8.6
happiness 8.6
staff 8.6
college 8.5
females 8.5
attractive 8.4
study 8.4
document 8.3
book 8.3
occupation 8.2
secretary 8.1
computer 8.1
dinner 8
kitchen 8
elementary age 7.9
discussing 7.9
couples 7.8
conference 7.8
having 7.7
books 7.7
partner 7.7
mid adult 7.7
dad 7.7
eating 7.6
friendship 7.5
coffee 7.4
cup 7.2
face 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
indoor 96.9
group 60.4
meal 27.7
dining table 9.7
crowd 0.5

Face analysis

AWS Rekognition

Age 29-39
Gender Female, 55.4%
Calm 99.5%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0.1%
Angry 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 48-56
Gender Male, 99.9%
Sad 99.6%
Calm 22.2%
Surprised 6.6%
Confused 6%
Fear 6%
Disgusted 2.4%
Angry 1.8%
Happy 0.3%

AWS Rekognition

Age 43-51
Gender Male, 99.5%
Calm 96.9%
Surprised 6.3%
Fear 5.9%
Sad 3%
Angry 0.3%
Disgusted 0.2%
Confused 0.1%
Happy 0.1%

AWS Rekognition

Age 31-41
Gender Male, 99%
Calm 61.3%
Happy 12.7%
Fear 9.4%
Sad 8.2%
Surprised 8.1%
Confused 2.4%
Disgusted 2%
Angry 1.1%
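
Note that Rekognition's emotion scores are independent confidences rather than a probability distribution, so they need not sum to 100% (the second face above reports both Sad 99.6% and Calm 22.2%). A small sketch (using a simplified dict structure, not Rekognition's raw response format) of picking the dominant emotion per face from the scores listed above:

```python
# Emotion scores hand-copied from the four AWS Rekognition face records above.
# Each dict is one face; values are independent confidences, not probabilities.
faces = [
    {"Calm": 99.5, "Surprised": 6.3, "Fear": 5.9, "Sad": 2.2},
    {"Sad": 99.6, "Calm": 22.2, "Surprised": 6.6, "Confused": 6.0},
    {"Calm": 96.9, "Surprised": 6.3, "Fear": 5.9, "Sad": 3.0},
    {"Calm": 61.3, "Happy": 12.7, "Fear": 9.4, "Sad": 8.2},
]

def dominant_emotion(scores):
    """Return the (emotion, confidence) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

for face in faces:
    print(dominant_emotion(face))
```

Applied to this record, three faces come out dominantly Calm and one dominantly Sad, matching the per-face listings above.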

Microsoft Cognitive Services

Age 52
Gender Male

Microsoft Cognitive Services

Age 57
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99%
Male 99%
Man 99%
Person 99%
Coffee Cup 90.1%

Categories