Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.2571

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-07

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 100
Cafeteria 99.9
Adult 98.9
Male 98.9
Man 98.9
Person 98.9
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Adult 98.2
Male 98.2
Man 98.2
Person 98.2
Person 98.1
Person 98
Hospital 95.5
Face 89.5
Head 89.5
Adult 84.3
Person 84.3
Bride 84.3
Female 84.3
Wedding 84.3
Woman 84.3
Food 75.5
Meal 75.5
People 70.6
Food Court 57.9
Clinic 56.5
Cafe 55.4
Operating Theatre 55.2
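The Amazon scores above are confidence values on a 0–100 scale, and a common downstream step is to keep only labels above some threshold. A minimal sketch of that filtering, using a handful of the scores listed in this record (the threshold of 90 is an arbitrary illustration, not part of the record):

```python
# A small subset of the Amazon-generated (label, confidence) pairs above.
tags = [
    ("Architecture", 100.0),
    ("Cafeteria", 99.9),
    ("Adult", 98.9),
    ("Hospital", 95.5),
    ("Bride", 84.3),
    ("Food", 75.5),
    ("Food Court", 57.9),
    ("Cafe", 55.4),
]

def confident_tags(tags, threshold=90.0):
    """Keep only labels at or above the confidence threshold, highest first."""
    kept = [t for t in tags if t[1] >= threshold]
    return sorted(kept, key=lambda t: t[1], reverse=True)

print(confident_tags(tags))
```

At a threshold of 90 this keeps Architecture, Cafeteria, Adult, and Hospital and drops the lower-confidence labels such as Bride and Cafe.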

Clarifai
created on 2018-05-10

people 99.9
group 99.2
adult 99.1
man 97.7
group together 96.3
furniture 95.8
woman 94.8
administration 93.1
sit 92.8
room 92.4
child 91.9
several 89.8
war 89.7
wear 89.1
five 88.5
many 88.4
recreation 87.7
two 86
boy 84.7
education 84.3

Imagga
created on 2023-10-07

person 39.6
patient 38.7
man 35.6
home 31.1
male 29.9
people 29.6
senior 27.2
adult 23.9
couple 23.5
barbershop 21.7
case 21.3
indoors 21.1
hairdresser 20.8
sick person 20.7
sitting 20.6
mature 19.5
smiling 18.1
hospital 17
shop 16.6
talking 16.2
family 16
medical 15.9
men 15.5
elderly 15.3
happy 15
together 14.9
room 14.9
retired 14.5
nurse 14.3
businessman 14.1
old 13.2
mercantile establishment 13.1
lifestyle 13
table 13
chair 12.9
health 12.5
adults 12.3
cheerful 12.2
professional 12.1
work 11.8
retirement 11.5
working 11.5
office 11.4
doctor 11.3
worker 11
grandfather 11
20s 11
indoor 11
70s 10.8
colleagues 10.7
30s 10.6
color 10.6
meeting 10.4
business 10.3
women 10.3
occupation 10.1
smile 10
mother 9.9
care 9.9
interior 9.7
two people 9.7
portrait 9.7
day 9.4
casual 9.3
face 9.2
group 8.9
caring 8.8
40s 8.8
place of business 8.7
couch 8.7
happiness 8.6
husband 8.6
communication 8.4
horizontal 8.4
child 8.4
teamwork 8.3
father 8.3
aged 8.1
restaurant 8.1
team 8.1
computer 8
uniform 7.9
love 7.9
discussing 7.9
kin 7.8
education 7.8
older 7.8
affectionate 7.7
stretcher 7.7
loving 7.6
illness 7.6
two 7.6
females 7.6
classroom 7.5
relaxed 7.5
surgeon 7.5
holding 7.4
inside 7.4
food 7.3
looking 7.2
student 7.1
handsome 7.1
to 7.1
medicine 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.4
group 59.2
restaurant 19.1
crowd 0.5

Color Analysis

Face analysis

AWS Rekognition

Age 12-20
Gender Male, 99.7%
Calm 69.2%
Sad 43.3%
Surprised 6.8%
Fear 6.1%
Confused 3.4%
Angry 0.6%
Disgusted 0.3%
Happy 0.2%
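Note that the emotion scores in each face block do not sum to 100, which suggests they are independent per-emotion confidences rather than a single probability distribution. A quick check on the values of the first face above:

```python
# Emotion confidences for the first AWS Rekognition face, copied from the record.
emotions = {
    "Calm": 69.2, "Sad": 43.3, "Surprised": 6.8, "Fear": 6.1,
    "Confused": 3.4, "Angry": 0.6, "Disgusted": 0.3, "Happy": 0.2,
}

total = round(sum(emotions.values()), 1)
print(total)  # 129.9 — well over 100, so these are not mutually exclusive
```

The same holds for the other faces (e.g. the second face's Sad 52.2% and Confused 29.4% alone exceed 80%), so the scores should be read per emotion, not as shares of one whole.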

AWS Rekognition

Age 21-29
Gender Male, 98.3%
Sad 52.2%
Calm 36.8%
Confused 29.4%
Fear 7.2%
Surprised 6.6%
Disgusted 1.3%
Angry 0.8%
Happy 0.4%

AWS Rekognition

Age 23-33
Gender Male, 71.6%
Sad 61.5%
Calm 46.3%
Happy 19.9%
Surprised 6.6%
Fear 6%
Confused 1.6%
Disgusted 0.8%
Angry 0.5%

AWS Rekognition

Age 16-24
Gender Female, 99%
Calm 85%
Sad 10.7%
Surprised 6.4%
Fear 6%
Disgusted 0.7%
Happy 0.7%
Confused 0.6%
Angry 0.4%

Microsoft Cognitive Services

Age 19
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.9%
Male 98.9%
Man 98.9%
Person 98.9%
Bride 84.3%
Female 84.3%
Woman 84.3%

Categories