Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American 1898 - 1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.966

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 100
Cafeteria 99.6
Food 99.5
Meal 99.5
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Adult 98.1
Male 98.1
Man 98.1
Person 98.1
Adult 97.9
Male 97.9
Man 97.9
Person 97.9
Adult 97.3
Male 97.3
Man 97.3
Person 97.3
Dinner 94.8
Person 94.7
Person 92.5
Face 89
Head 89
Person 87.4
Food Court 83.5
Dish 76.7
People 72.8
Diner 55.9
Lunch 55.6
Home Decor 55.5
Linen 55.5
Cafe 55.3
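
The label/confidence pairs above are the kind of output returned by Amazon Rekognition's DetectLabels operation, where each label carries a confidence score in the 0-100 range. A minimal sketch using boto3; the file name and confidence threshold are illustrative assumptions, not values taken from this record:

# Minimal sketch: fetch image labels with Amazon Rekognition DetectLabels.
# The file name and threshold below are illustrative assumptions.
import boto3

def detect_labels(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")   # uses your configured AWS credentials
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_labels(
        Image={"Bytes": image_bytes},
        MinConfidence=min_confidence,       # drop labels below this confidence
    )
    # Each label has a Name and a Confidence (0-100), matching the
    # "tag  confidence" pairs listed above.
    return [(label["Name"], round(label["Confidence"], 1)) for label in response["Labels"]]

if __name__ == "__main__":
    for name, confidence in detect_labels("shahn_wheat_harvest.jpg"):
        print(name, confidence)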

Clarifai
created on 2018-05-11

people 100
group 99.5
group together 99.4
adult 98.2
many 97
man 96.3
woman 96.2
sit 94.9
furniture 94.5
several 94.3
military 94.2
five 94
room 93.7
war 93.3
four 92.2
administration 90.6
recreation 90.4
child 87.4
leader 82.4
education 77
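
The Clarifai concepts above come from a predict call against Clarifai's general model, which returns concept values as 0-1 floats (shown here as percentages). A minimal REST sketch using the requests library; the model identifier and API-key environment variable are assumptions, and the exact model version used for the 2018 run may differ:

# Minimal sketch: request concept predictions from Clarifai's general model.
# CLARIFAI_API_KEY and MODEL_ID are assumptions; adjust to your account.
import os
import requests

API_KEY = os.environ["CLARIFAI_API_KEY"]
MODEL_ID = "general-image-recognition"   # assumed alias for the general model

def clarifai_concepts(image_url: str):
    response = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"url": image_url}}}]},
        timeout=30,
    )
    response.raise_for_status()
    concepts = response.json()["outputs"][0]["data"]["concepts"]
    # Concept values are 0-1 floats; multiply by 100 to match the list above.
    return [(c["name"], round(c["value"] * 100, 1)) for c in concepts]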

Imagga
created on 2023-10-05

man 39
male 29.8
people 29.5
table 29.2
restaurant 27.2
adult 25.7
room 24.1
person 22.6
business 22.5
dinner 21.3
indoors 21.1
sitting 20.6
together 20.1
office 20
couple 20
meeting 19.8
happy 19.4
businessman 18.5
meal 18.4
home 18.3
lifestyle 18.1
women 17.4
smiling 17.4
team 17
interior 16.8
40s 16.5
businesswoman 16.4
men 16.3
working 15.9
food 15.7
family 15.1
work 14.9
drink 14.2
20s 13.7
laptop 13.7
desk 13.4
talking 13.3
group 12.9
shop 12.7
classroom 12.6
30s 12.5
businesspeople 12.3
adults 12.3
four people 11.8
colleagues 11.7
computer 11.2
inside 11
portrait 11
glass 11
lunch 10.8
hairdresser 10.7
smile 10.7
dining 10.5
education 10.4
friends 10.3
elegant 10.3
mature 10.2
teamwork 10.2
eating 10.1
children 10
daughter 9.8
chair 9.7
child 9.7
mother 9.5
corporate 9.4
wine 9.3
coffee 9.3
worker 9.2
cafeteria 9.2
hospital 9.2
indoor 9.1
modern 9.1
holding 9.1
discussion 8.8
having 8.7
daytime 8.7
class 8.7
building 8.6
formal 8.6
casual 8.5
clothing 8.4
barbershop 8.3
breakfast 8.3
successful 8.2
plate 7.9
elementary age 7.9
boardroom 7.9
mercantile establishment 7.9
day 7.8
days 7.8
couples 7.8
catering 7.8
party 7.7
setting 7.7
professional 7.7
two 7.6
college 7.6
friendship 7.5
executive 7.5
student 7.5
presentation 7.4
service 7.4
banquet 7.4
center 7.3
kitchen 7.3
success 7.2
color 7.2
celebration 7.2
father 7.1
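
Imagga's tags come from its REST tagging endpoint, authenticated with an API key/secret pair over HTTP basic auth, and each tag carries a 0-100 confidence as listed above. A minimal sketch; the credential environment variables are assumptions:

# Minimal sketch: tag an image with the Imagga v2 tagging endpoint.
# IMAGGA_KEY and IMAGGA_SECRET are assumed environment variables.
import os
import requests

def imagga_tags(image_url: str):
    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": image_url},
        auth=(os.environ["IMAGGA_KEY"], os.environ["IMAGGA_SECRET"]),
        timeout=30,
    )
    response.raise_for_status()
    tags = response.json()["result"]["tags"]
    # Each entry carries a confidence (0-100) and a language-keyed tag name.
    return [(t["tag"]["en"], round(t["confidence"], 1)) for t in tags]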

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 98.7
sitting 94.8
group 90.7
indoor 89.6
people 76.8
table 31.8
restaurant 18.5
dining table 9.9
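
The Microsoft tags resemble output from the Azure Computer Vision image-analysis API with the Tags visual feature, which reports confidences as 0-1 floats. A minimal REST sketch; the endpoint and key environment variables and the API version path are assumptions, and the 2018 tags on this page would have come from an earlier service version:

# Minimal sketch: request image tags from Azure Computer Vision (Analyze Image).
# AZURE_CV_ENDPOINT, AZURE_CV_KEY, and the API version path are assumptions.
import os
import requests

def azure_tags(image_url: str):
    endpoint = os.environ["AZURE_CV_ENDPOINT"].rstrip("/")
    response = requests.post(
        f"{endpoint}/vision/v3.2/analyze",
        params={"visualFeatures": "Tags"},
        headers={"Ocp-Apim-Subscription-Key": os.environ["AZURE_CV_KEY"]},
        json={"url": image_url},
        timeout=30,
    )
    response.raise_for_status()
    # Tag confidences are 0-1 floats; multiply by 100 to match the list above.
    return [(t["name"], round(t["confidence"] * 100, 1)) for t in response.json()["tags"]]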

Color Analysis

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 100%
Calm 88.9%
Surprised 7%
Fear 6.1%
Confused 4.6%
Sad 3.2%
Angry 0.8%
Disgusted 0.6%
Happy 0.5%

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Confused 0%
Disgusted 0%
Angry 0%
Happy 0%

AWS Rekognition

Age 18-26
Gender Male, 99.9%
Calm 58%
Sad 30.4%
Confused 10.6%
Surprised 8%
Fear 6.6%
Happy 3.3%
Disgusted 1.5%
Angry 0.6%

AWS Rekognition

Age 23-31
Gender Male, 99.9%
Surprised 53.6%
Calm 36.3%
Confused 12.9%
Fear 8%
Sad 3.9%
Angry 3.2%
Happy 3%
Disgusted 2.4%

AWS Rekognition

Age 22-30
Gender Male, 98.4%
Calm 72.8%
Happy 21.8%
Fear 6.7%
Surprised 6.5%
Sad 2.7%
Disgusted 0.7%
Confused 0.4%
Angry 0.3%
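
The five face records above (age range, gender, and per-emotion confidences) match the shape of Amazon Rekognition's DetectFaces response when all facial attributes are requested. A minimal boto3 sketch; the file name is an illustrative assumption:

# Minimal sketch: estimate age range, gender, and emotions with
# Amazon Rekognition DetectFaces. The file name is an illustrative assumption.
import boto3

def detect_faces(image_path: str):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],            # request age range, gender, emotions, etc.
    )
    results = []
    for face in response["FaceDetails"]:
        age = face["AgeRange"]         # {"Low": ..., "High": ...}
        gender = face["Gender"]        # {"Value": ..., "Confidence": ...}
        emotions = sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True)
        results.append((age, gender, emotions))
    return results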

Microsoft Cognitive Services

Age 23
Gender Male

Microsoft Cognitive Services

Age 46
Gender Male
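
The two Microsoft estimates above (a single age and a gender per face) match the older Azure Face API detection call with the age and gender attributes requested. A minimal sketch using the azure-cognitiveservices-vision-face SDK; the endpoint and key variables are assumptions, and Microsoft has since retired age and gender estimation from the Face API, so this reflects the historical service rather than the current one:

# Minimal sketch: detect faces with the older Azure Face API and read the
# estimated age and gender. Endpoint/key variables are assumptions; these
# attributes have since been retired, so this may not run against the
# current live service.
import os
from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

def face_age_gender(image_url: str):
    client = FaceClient(
        os.environ["AZURE_FACE_ENDPOINT"],
        CognitiveServicesCredentials(os.environ["AZURE_FACE_KEY"]),
    )
    faces = client.face.detect_with_url(
        url=image_url,
        return_face_attributes=["age", "gender"],
    )
    return [(f.face_attributes.age, f.face_attributes.gender) for f in faces]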

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
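
The Google Vision rows report likelihood buckets (very unlikely through very likely) rather than numeric confidences. A minimal sketch using the google-cloud-vision client; the file name is an illustrative assumption:

# Minimal sketch: read face likelihood buckets with the Google Cloud Vision API.
# The file name is an illustrative assumption.
from google.cloud import vision

def face_likelihoods(image_path: str):
    client = vision.ImageAnnotatorClient()    # uses your configured credentials
    with open(image_path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    results = []
    for face in response.face_annotations:
        # Each field is a Likelihood enum, e.g. VERY_UNLIKELY, as shown above.
        results.append({
            "surprise": face.surprise_likelihood.name,
            "anger": face.anger_likelihood.name,
            "sorrow": face.sorrow_likelihood.name,
            "joy": face.joy_likelihood.name,
            "headwear": face.headwear_likelihood.name,
            "blurred": face.blurred_likelihood.name,
        })
    return results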

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%

Categories