Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898-1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.944

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-06

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 100
Cafeteria 99.8
Food 99.4
Meal 99.4
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.8
Male 98.8
Man 98.8
Person 98.8
Adult 98.4
Male 98.4
Man 98.4
Person 98.4
Adult 98
Male 98
Man 98
Person 98
Person 97.4
Adult 97.1
Male 97.1
Man 97.1
Person 97.1
Person 95.4
Food Court 92.7
Face 88.2
Head 88.2
Dinner 79.9
Person 79.8
Dish 65.7
People 56.1
Cafe 55.1
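
These label-and-confidence pairs have the shape of Amazon Rekognition's DetectLabels output. Below is a minimal sketch of how such tags could be produced with boto3; the region, bucket, object key, and the MaxLabels/MinConfidence settings are placeholders and assumptions, not values taken from this record.

import boto3

# Rekognition client; the region is an assumption.
client = boto3.client("rekognition", region_name="us-east-1")

# Placeholder S3 location for wherever the digitized image is stored.
response = client.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-wheat-harvest.jpg"}},
    MaxLabels=50,
    MinConfidence=50.0,
)

# Print "Label confidence" pairs in the same shape as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")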

Clarifai
created on 2018-05-11

people 100
group 99.3
adult 99
group together 98.5
war 96.6
many 96.4
several 96.2
man 96.2
military 95.9
woman 94.7
administration 93.8
four 93.4
five 91.9
sit 91.5
room 91
furniture 89.8
child 85.8
wear 84.4
three 83.3
leader 82.9
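
Clarifai returns concept/confidence pairs like those above from its general model. A hedged sketch against Clarifai's v2 predict REST endpoint follows; the API key and image URL are placeholders, and the model ID is an assumption (the 2018 run may have used a different model or version).

import requests

# Placeholders: a Clarifai API key and a public URL for the image.
API_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.org/shahn-wheat-harvest.jpg"

response = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)

# Concepts arrive as name/value pairs; value is a 0-1 confidence.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")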

Imagga
created on 2023-10-06

man 39.1
people 34
male 33.4
room 29.5
table 28.2
hairdresser 26.8
person 25.2
home 23.1
office 22.9
adult 22.6
restaurant 22.4
together 21.9
business 20.7
sitting 20.6
classroom 20.4
indoors 20.2
happy 20.1
businesswoman 20
meeting 19.8
businessman 19.4
team 18.8
group 18.5
couple 18.3
smiling 18.1
men 18
working 17.7
drink 16.7
women 16.6
work 16.5
coffee 15.7
smile 15.7
meal 15.5
lifestyle 15.2
interior 15
teamwork 13.9
food 13.6
desk 13.4
professional 13.4
talking 13.3
businesspeople 13.3
executive 13
laptop 12.8
family 12.5
corporate 12
computer 12
chair 12
20s 11.9
dinner 11.9
worker 11.9
breakfast 11.7
presentation 11.2
wine 11.1
indoor 11
40s 10.7
cheerful 10.6
adults 10.4
senior 10.3
mature 10.2
glass 10.1
eating 10.1
shop 10.1
kitchen 9.8
modern 9.8
attractive 9.8
portrait 9.7
success 9.7
service 9.3
successful 9.2
confident 9.1
holding 9.1
cup 9
suit 9
job 8.8
colleagues 8.7
having 8.7
student 8.7
busy 8.7
party 8.6
dining 8.6
friends 8.5
communication 8.4
old 8.4
teacher 8.2
children 8.2
beverage 8.1
lunch 8
standing 7.8
hospital 7.7
30s 7.7
two 7.6
enjoying 7.6
clothing 7.6
document 7.4
mother 7.3
paper 7.2
patient 7.2
handsome 7.1
child 7.1
day 7.1
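
Imagga's tagging service yields the same tag/confidence shape. A minimal sketch using Imagga's v2 /tags REST endpoint with HTTP Basic auth; the key, secret, and image URL are placeholders.

import requests

# Placeholders: Imagga API credentials and a public image URL.
AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")
IMAGE_URL = "https://example.org/shahn-wheat-harvest.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=AUTH,
)

# Each entry carries a 0-100 confidence and a language-keyed tag name.
for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")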

Google
created on 2018-05-11

Microsoft
created on 2018-05-11

person 99.9
table 97.5
sitting 97.3
group 91.7
people 82.3
restaurant 38.9
meal 33.5
family 30.5
dining table 8.8
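
Microsoft's tags come from the Computer Vision "analyze" operation. A sketch against the v3.2 REST API; the endpoint, key, and image URL are placeholders, and since the 2018 run predates v3.2, the exact API version is an assumption.

import requests

# Placeholders: Azure resource endpoint, subscription key, and image URL.
ENDPOINT = "https://example.cognitiveservices.azure.com"
KEY = "YOUR_AZURE_KEY"
IMAGE_URL = "https://example.org/shahn-wheat-harvest.jpg"

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={"Ocp-Apim-Subscription-Key": KEY},
    json={"url": IMAGE_URL},
)

# Tags are name/confidence pairs; confidence is 0-1.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")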

Color Analysis

Face analysis

AWS Rekognition

Age 20-28
Gender Male, 99.9%
Calm 100%
Surprised 6.3%
Fear 5.9%
Sad 2.1%
Angry 0%
Confused 0%
Disgusted 0%
Happy 0%

AWS Rekognition

Age 31-41
Gender Male, 99.8%
Calm 56%
Confused 22.9%
Sad 11.7%
Surprised 7.9%
Fear 6.3%
Disgusted 1.9%
Angry 1.7%
Happy 0.9%

AWS Rekognition

Age 26-36
Gender Male, 97.9%
Calm 81.2%
Sad 9.5%
Surprised 6.9%
Fear 6.5%
Confused 2.9%
Happy 1.1%
Disgusted 0.6%
Angry 0.5%

AWS Rekognition

Age 23-33
Gender Male, 100%
Calm 91.2%
Surprised 7.7%
Fear 6%
Confused 3.9%
Sad 2.3%
Angry 0.6%
Disgusted 0.5%
Happy 0.4%

AWS Rekognition

Age 21-29
Gender Male, 59.8%
Calm 77.4%
Angry 13.8%
Surprised 7.3%
Fear 7.2%
Sad 2.3%
Happy 2.1%
Confused 0.6%
Disgusted 0.4%
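
The per-face age ranges, gender scores, and emotion confidences above match the shape of Amazon Rekognition's DetectFaces output when all attributes are requested. A hedged boto3 sketch; the region and S3 location are placeholders.

import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region is an assumption

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "shahn-wheat-harvest.jpg"}},
    Attributes=["ALL"],  # required to get AgeRange, Gender, and Emotions
)

# One block per detected face, as in the listings above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")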

Microsoft Cognitive Services

Age 44
Gender Male

Microsoft Cognitive Services

Age 26
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
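
Google Vision reports face attributes as likelihood buckets ("Very unlikely" through "Very likely") rather than percentages. A minimal sketch with the google-cloud-vision client library; the image URI is a placeholder, and credentials are assumed to be configured in the environment.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder: a publicly readable URI for the image.
image = vision.Image()
image.source.image_uri = "https://example.org/shahn-wheat-harvest.jpg"

response = client.face_detection(image=image)

# Likelihood enums map to the buckets shown above (e.g. VERY_UNLIKELY).
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)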

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
