Human Generated Data

Title

4 1/2 Out of Every Five

Date

1941

People

Artist: Ben Shahn, American, 1898–1969

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Stephen Lee Taller Ben Shahn Archive, Gift of Dolores Taller, M24904

Copyright

© Estate of Ben Shahn / Artists Rights Society (ARS), New York

Machine Generated Data

Tags

Amazon
created on 2019-10-29

Person 98.1
Human 98.1
Person 97.3
Person 95
Clothing 85.9
Apparel 85.9
Hat 85.9
Person 79.4
Hat 77.5
Art 75.4
Person 62.9
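The label/confidence pairs above follow the general shape of Amazon Rekognition's DetectLabels output (a list of label names with confidence scores). A minimal sketch of rendering such a response into the listing format used here — the response fragment is illustrative, echoing values from the list above, not an actual API call:

```python
# Illustrative fragment in the shape of a Rekognition DetectLabels
# response; the values echo the Amazon tag listing above.
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 98.1},
        {"Name": "Human", "Confidence": 98.1},
        {"Name": "Hat", "Confidence": 85.9},
        {"Name": "Art", "Confidence": 75.4},
    ]
}

def format_labels(response, min_confidence=50.0):
    """Render labels as 'Name confidence' lines, as in the tag listing."""
    lines = []
    for label in response["Labels"]:
        if label["Confidence"] >= min_confidence:
            # %g trims trailing zeros, matching the listing (98.1, 95, ...)
            conf = f"{label['Confidence']:g}"
            lines.append(f"{label['Name']} {conf}")
    return lines

print("\n".join(format_labels(response)))
```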

Clarifai
created on 2019-10-29

wear 99.4
man 97
outfit 96.5
people 96.4
adult 95.8
painting 94
one 91.4
illustration 90.9
no person 90.4
woman 90.1
gown (clothing) 89.5
art 88.3
veil 88
lid 86.6
facial expression 86.3
dress 85.8
outerwear 85.1
furniture 83.7
jacket 81.9
two 81.8

Imagga
created on 2019-10-29

man 34.4
business 32.2
male 31.9
businessman 30.9
people 30.7
jacket 29.7
person 27.1
team 25.1
silhouette 24
suit 23.3
clothing 22.6
group 20.1
businesswoman 20
teamwork 19.5
men 18.9
work 18.8
crowd 18.2
leader 16.4
boss 16.2
sexy 15.3
job 15
presentation 14.9
office 14.7
corporate 14.6
success 14.5
flag 14.3
garment 13.5
outfit 13.2
shopping 12.9
occupation 12.8
fashion 12.8
black 12.5
adult 12.5
clothes 12.2
sale 12
president 11.8
audience 11.7
cap 11.6
executive 11.6
patriotic 11.5
smile 11.4
happy 11.3
vivid 11.1
bag 11.1
women 11.1
casual 11
gown 11
successful 11
supporters 10.9
cheering 10.8
nighttime 10.8
speech 10.8
stadium 10.7
buying 10.6
vibrant 10.5
standing 10.4
nation 10.4
covering 10.4
lights 10.2
dress 9.9
design 9.6
academic gown 9.5
icon 9.5
symbol 9.4
happiness 9.4
buy 9.4
art 9.3
shirt 9.3
bright 9.3
mortarboard 8.8
professional 8.8
leadership 8.6
career 8.5
store 8.5
attractive 8.4
manager 8.4
holding 8.2
present 8.2
boy 8.1
smiling 8
hat 7.8
staff 7.8
businessmen 7.8
education 7.8
mall 7.8
shop 7.8
gift 7.7
consumer goods 7.6
meeting 7.5
company 7.4
lifestyle 7.2

Google
created on 2019-10-29

Microsoft
created on 2019-10-29

clothing 92.4
text 91.8
man 90.5
person 78.3
hat 78.1
suit 75.8
old 72.2
clothes 17.6

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 96.8%
Disgusted 0.1%
Fear 0%
Sad 0.1%
Happy 0.1%
Angry 0.1%
Calm 99.4%
Confused 0.1%
Surprised 0.1%

AWS Rekognition

Age 32-48
Gender Male, 99.9%
Sad 1.6%
Surprised 5.8%
Happy 0%
Confused 8.4%
Fear 0.7%
Disgusted 0.3%
Calm 82.7%
Angry 0.4%

AWS Rekognition

Age 38-56
Gender Male, 99%
Disgusted 0.1%
Angry 0.2%
Sad 25%
Fear 0%
Confused 0.5%
Surprised 0%
Happy 0%
Calm 74.2%

AWS Rekognition

Age 56-74
Gender Male, 98.2%
Disgusted 0.1%
Surprised 0.6%
Confused 1.5%
Sad 1.5%
Happy 0%
Fear 0.1%
Angry 0.5%
Calm 95.5%
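Each AWS Rekognition block above (age range, gender with confidence, per-emotion percentages) mirrors the FaceDetails structure that Rekognition's DetectFaces API returns. A minimal sketch of rendering one such record in the style shown, using an illustrative fragment that echoes the first face block rather than a live API call:

```python
# Illustrative fragment in the shape of one Rekognition FaceDetails
# entry; the values echo the first face block above.
face = {
    "AgeRange": {"Low": 22, "High": 34},
    "Gender": {"Value": "Male", "Confidence": 96.8},
    "Emotions": [
        {"Type": "CALM", "Confidence": 99.4},
        {"Type": "SAD", "Confidence": 0.1},
        {"Type": "HAPPY", "Confidence": 0.1},
    ],
}

def format_face(face):
    """Render a face record as 'Age low-high', 'Gender ...', then emotions."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:g}%",
    ]
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase (CALM); the listing uses Calm
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:g}%")
    return lines

print("\n".join(format_face(face)))
```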

Microsoft Cognitive Services

Age 31
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.1%
Hat 85.9%

Captions

Microsoft

a group of people wearing costumes 65%
a group of men wearing suits and ties 64.9%
a group of people posing for a photo 64.8%

Text analysis

Amazon

'