Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.934

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Cafeteria 100
Indoors 100
Restaurant 100
Dining Table 100
Furniture 100
Table 100
Food 100
Meal 100
Architecture 100
Building 100
Dining Room 100
Room 100
Dish 98.7
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Person 92.8
Face 92.6
Head 92.6
Adult 92.1
Person 92.1
Female 92.1
Woman 92.1
Eating 87.8
Person 79.9
Baby 79.9
Lunch 78.3
Dinner 78.3
People 68.4
Food Court 57.1
Smoke 57
Tabletop 56.7
Buffet 55.4
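
The label-and-confidence pairs above match the shape of output returned by AWS Rekognition's detect_labels operation. As a minimal sketch only, assuming boto3 is installed, AWS credentials are configured, and a local copy of the photograph is saved as wheat_harvest.jpg (a hypothetical filename), tags like these could be produced roughly as follows:

    # Minimal sketch: object/scene labels via AWS Rekognition detect_labels.
    # Assumes configured AWS credentials; "wheat_harvest.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("wheat_harvest.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=50,
        MinConfidence=55,  # the lowest confidence listed above is 55.4
    )

    # Print "Label confidence" pairs, mirroring the list above.
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")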

Clarifai
created on 2018-05-11

people 99.9
group together 99.3
group 99.3
man 97.5
adult 97.5
four 97
several 94.7
many 94.4
military 94.3
war 93.7
administration 93.2
woman 92.9
furniture 92.4
room 91.7
sit 91.6
five 91.5
leader 89.6
recreation 86.6
three 81.8
family 80.2

Imagga
created on 2023-10-05

man 37.1
people 34.1
male 32.6
adult 30.6
home 30.3
child 29.1
happy 28.9
sitting 28.4
indoors 27.3
smiling 26.8
couple 26.2
classroom 25.3
person 25.3
together 23.7
kin 22.7
family 22.3
room 19.9
children 18.2
boy 17.4
father 17.2
smile 17.1
mother 16.5
30s 15.4
teacher 15.1
dad 14.8
laptop 14.7
20s 14.7
40s 14.6
group 14.5
love 14.2
adults 14.2
desk 14.2
meeting 14.1
happiness 14.1
education 13.9
office 13.8
husband 13.7
computer 13.7
daughter 13.5
women 13.5
businessman 13.3
table 13.2
mature 13
cheerful 13
business 12.8
businesswoman 12.7
attractive 12.6
childhood 12.6
student 12.5
talking 12.4
lifestyle 12.3
color 12.2
parent 12.2
looking 12
fun 12
team 11.7
class 11.6
businesspeople 11.4
wife 11.4
senior 11.3
two 11
discussing 10.8
discussion 10.7
middle aged 10.7
kid 10.6
reading 10.5
casual 10.2
patient 10
holding 9.9
clothing 9.8
portrait 9.7
professional 9.7
couch 9.7
corporate 9.5
men 9.5
togetherness 9.5
day 9.4
friends 9.4
learning 9.4
cute 9.3
teamwork 9.3
relaxing 9.1
forties 8.8
thirties 8.8
two people 8.8
colleagues 8.8
studying 8.6
horizontal 8.4
coffee 8.3
nurse 8.3
book 8.2
indoor 8.2
playing 8.2
cup 8.1
son 8.1
working 8
work 7.9
pen 7.8
students 7.8
consultant 7.8
retired 7.8
daytime 7.7
eating 7.6
kids 7.5
drink 7.5
friendship 7.5
camera 7.4
school 7.4
emotion 7.4
executive 7.4
beverage 7.3
little 7.1
baby 7.1

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

wall 99.5
person 99.3
indoor 93.3
group 67
dish 40.9
meal 24.3
dining table 8.4

Color Analysis

Face analysis

AWS Rekognition

Age 22-30
Gender Male, 99%
Calm 79.1%
Surprised 13.6%
Fear 6.3%
Disgusted 3.8%
Happy 3.5%
Sad 2.4%
Confused 1%
Angry 0.8%

AWS Rekognition

Age 48-56
Gender Male, 98.9%
Calm 48.6%
Confused 28.2%
Sad 11.5%
Fear 8%
Surprised 7.9%
Angry 1.2%
Disgusted 0.8%
Happy 0.8%

AWS Rekognition

Age 33-41
Gender Male, 99.8%
Calm 80.6%
Confused 11%
Surprised 6.6%
Fear 6%
Disgusted 5.9%
Sad 2.3%
Angry 0.6%
Happy 0.3%

AWS Rekognition

Age 22-30
Gender Male, 100%
Calm 99.9%
Surprised 6.3%
Fear 5.9%
Sad 2.2%
Confused 0%
Happy 0%
Disgusted 0%
Angry 0%
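
The age ranges, gender estimates, and per-emotion scores in the blocks above correspond to fields in the FaceDetails that AWS Rekognition's detect_faces operation returns when called with Attributes=["ALL"]. A minimal sketch, under the same assumptions as the label example above (configured credentials, hypothetical local file wheat_harvest.jpg):

    # Minimal sketch: face attributes via AWS Rekognition detect_faces.
    # Assumes configured AWS credentials; "wheat_harvest.jpg" is hypothetical.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("wheat_harvest.jpg", "rb") as f:
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # include age range, gender, and emotions
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        gender = face["Gender"]
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotion confidences are scored independently, so they need not sum to 100.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")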

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
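
The "Very unlikely" / "Unlikely" ratings above follow the likelihood scale that the Google Cloud Vision API reports for face annotations. A minimal sketch, assuming the google-cloud-vision client library is installed, application credentials are configured, and the same hypothetical local file is used:

    # Minimal sketch: face likelihoods via the Google Cloud Vision API.
    # Assumes configured application credentials; "wheat_harvest.jpg" is hypothetical.
    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("wheat_harvest.jpg", "rb") as f:
        content = f.read()

    response = client.face_detection(image=vision.Image(content=content))

    def label(likelihood) -> str:
        # Convert the Likelihood enum to the phrasing used above,
        # e.g. VERY_UNLIKELY -> "Very unlikely".
        return vision.Likelihood(likelihood).name.replace("_", " ").capitalize()

    for face in response.face_annotations:
        print("Surprise", label(face.surprise_likelihood))
        print("Anger", label(face.anger_likelihood))
        print("Sorrow", label(face.sorrow_likelihood))
        print("Joy", label(face.joy_likelihood))
        print("Headwear", label(face.headwear_likelihood))
        print("Blurred", label(face.blurred_likelihood))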

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%
Female 92.1%
Woman 92.1%
Baby 79.9%