Human Generated Data

Title

Untitled (wheat harvest, central Ohio)

Date

August 1938

People

Artist: Ben Shahn, American, 1898–1969

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Bernarda Bryson Shahn, P1970.3046

Copyright

© President and Fellows of Harvard College

Machine Generated Data

Tags

Amazon
created on 2023-10-05

Architecture 100
Building 100
Dining Room 100
Dining Table 100
Furniture 100
Indoors 100
Room 100
Table 100
Restaurant 100
Cafeteria 99.8
Adult 98.6
Male 98.6
Man 98.6
Person 98.6
Adult 98.5
Male 98.5
Man 98.5
Person 98.5
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Adult 98.3
Male 98.3
Man 98.3
Person 98.3
Person 97.5
Person 97
Hospital 95.6
Face 87.5
Head 87.5
Adult 87
Person 87
Bride 87
Female 87
Wedding 87
Woman 87
Food 69.3
Meal 69.3
Clinic 57.5
Food Court 57.3
People 56.3
Animal 55.8
Bird 55.8
Cafe 55.7
Operating Theatre 55.5
Art 55
Painting 55

Clarifai
created on 2018-05-10

people 99.8
group 99.1
adult 98.8
woman 97.2
man 97.1
group together 96.8
furniture 93.8
sit 93.3
room 93.3
several 91.4
administration 90
recreation 89.6
child 89.4
many 88.6
five 88.4
war 88.4
education 86.8
wear 86.3
indoors 81.5
four 81.5

Imagga
created on 2023-10-05

man 38.3
senior 34.7
person 34.2
people 32.9
male 30.6
home 27.9
couple 27.9
adult 23.9
barbershop 22.8
patient 22.5
mature 21.4
hairdresser 21.2
room 20.3
sitting 19.8
men 19.8
retired 19.4
indoors 19.3
smiling 18.8
elderly 18.2
old 18.1
shop 17.4
happy 16.9
together 16.6
family 15.1
lifestyle 14.5
retirement 14.4
talking 14.3
mercantile establishment 13.8
cheerful 13
table 13
medical 12.4
smile 12.1
occupation 11.9
chair 11.8
aged 11.8
nurse 11.7
restaurant 11.5
classroom 11.2
portrait 11
day 11
happiness 11
70s 10.8
grandfather 10.8
40s 10.7
care 10.7
older 10.7
businessman 10.6
health 10.4
love 10.3
professional 10.2
two 10.2
inside 10.1
case 9.8
grandmother 9.8
women 9.5
adults 9.5
color 9.5
doctor 9.4
horizontal 9.2
place of business 9.2
indoor 9.1
sick person 9
interior 8.8
seniors 8.8
working 8.8
two people 8.8
worker 8.7
30s 8.7
illness 8.6
office 8.5
meeting 8.5
teacher 8.5
drink 8.4
wine 8.3
meal 8.2
hospital 8.1
life 8.1
clinic 8.1
pensioner 8
business 7.9
food 7.9
caring 7.9
50s 7.8
casual clothing 7.8
60s 7.8
face 7.8
colleagues 7.8
check 7.7
husband 7.6
loving 7.6
casual 7.6
hand 7.6
alcohol 7.4
mother 7.3
group 7.3
computer 7.2
kin 7.2
work 7.1
to 7.1
medicine 7

Google
created on 2018-05-10

Microsoft
created on 2018-05-10

person 99.6
indoor 90.5
window 80.9
group 70.9
people 63.6
restaurant 26.1
crowd 0.7

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 97.7%
Calm 92.1%
Surprised 6.7%
Fear 5.9%
Confused 3.4%
Sad 3.2%
Angry 0.2%
Happy 0.1%
Disgusted 0.1%

AWS Rekognition

Age 25-35
Gender Male, 97.4%
Sad 99.4%
Calm 34.7%
Surprised 6.4%
Fear 6%
Happy 1.1%
Confused 0.5%
Disgusted 0.3%
Angry 0.3%

AWS Rekognition

Age 19-27
Gender Female, 95.9%
Calm 91.1%
Fear 7%
Surprised 6.4%
Sad 2.6%
Happy 2.2%
Confused 0.9%
Disgusted 0.7%
Angry 0.7%

Microsoft Cognitive Services

Age 6
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Adult 98.6%
Male 98.6%
Man 98.6%
Person 98.6%
Bride 87%
Female 87%
Woman 87%
Bird 55.8%
