Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.972

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2023-10-06

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 99.7
Adult 99.1
Male 99.1
Man 99.1
Person 99.1
Adult 99
Male 99
Man 99
Person 99
Person 98.6
Person 98.2
Person 97.9
Adult 97.7
Male 97.7
Man 97.7
Person 97.7
Cafeteria 97
Face 89.8
Head 89.8
Adult 88
Person 88
Bride 88
Female 88
Wedding 88
Woman 88
Hospital 73.1
People 70.4
Food 67.4
Meal 67.4
Food Court 56.5
Chair 56.3
Tablecloth 55.1

Clarifai
created on 2018-05-11

people 99.8
group 99.5
group together 98.2
adult 97.9
man 97.5
administration 94.5
several 94.2
furniture 93.6
many 93.2
sit 92.8
child 92.3
war 92.2
woman 91.5
five 90.8
room 90.4
education 88.8
military 87.3
four 86.9
boy 82.8
recreation 82.5

Imagga
created on 2023-10-06

man 41
classroom 37.6
room 35
person 34.6
people 31.8
male 27.8
smiling 23.9
adult 23.4
sitting 23.2
home 21.5
businessman 21.2
hairdresser 20.7
patient 20.4
meeting 19.8
mature 19.5
couple 19.2
talking 19
business 18.8
happy 18.2
lifestyle 18.1
indoors 17.6
together 17.5
colleagues 17.5
table 17.3
office 16.5
group 16.1
family 16
men 15.5
senior 15
indoor 14.6
barbershop 14.6
case 14.3
smile 14.3
businesspeople 14.2
businesswoman 13.6
sick person 13.6
30s 13.5
working 13.3
cheerful 13
mother 13
20s 12.8
40s 12.7
communication 12.6
work 12.6
team 12.5
chair 12.5
shop 12.3
education 12.1
teamwork 12.1
computer 12
color 11.7
nurse 11.6
teacher 11.4
student 11
happiness 11
professional 10.8
couch 10.6
four 10.5
executive 10.5
old 10.5
adults 10.4
desk 10.4
child 10.3
women 10.3
day 10.2
mercantile establishment 9.7
discussion 9.7
retired 9.7
mid adult 9.6
elderly 9.6
school 9.4
casual 9.3
laptop 9.3
portrait 9.1
businessmen 8.8
two people 8.7
monk 8.7
boy 8.7
corporation 8.7
corporate 8.6
two 8.5
suit 8.1
worker 8.1
interior 8
restaurant 8
boardroom 7.9
business people 7.9
discussing 7.9
coworkers 7.9
grandfather 7.8
casual clothing 7.8
face 7.8
thirties 7.8
retirement 7.7
expression 7.7
health 7.6
friends 7.5
care 7.4
children 7.3
newspaper 7.1
love 7.1
medical 7.1

Microsoft
created on 2018-05-11

person 98.2
indoor 91.7
group 69.3
people 57.3

Face analysis

AWS Rekognition (Face 1)

Age 12-20
Gender Male, 96.4%
Sad 62.9%
Calm 51.5%
Surprised 10.9%
Confused 6.8%
Fear 6.4%
Angry 1.6%
Disgusted 0.7%
Happy 0.5%

AWS Rekognition (Face 2)

Age 20-28
Gender Male, 98.7%
Sad 100%
Surprised 6.3%
Fear 6.1%
Calm 2.5%
Confused 1.1%
Disgusted 0.3%
Angry 0.2%
Happy 0.2%

AWS Rekognition (Face 3)

Age 23-33
Gender Male, 89.4%
Calm 37.4%
Angry 21%
Sad 17.1%
Confused 10.2%
Surprised 9.3%
Disgusted 6.4%
Fear 6.2%
Happy 3.5%

Microsoft Cognitive Services

Age 5
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 99.1%
Male 99.1%
Man 99.1%
Person 99.1%
Bride 88%
Female 88%
Woman 88%
Chair 56.3%

Categories

Imagga

people portraits 52.2%
paintings art 44.7%
pets animals 2.4%