Human Generated Data

Title

Demonstration

Date

1933

People

Artist: Ben Shahn, American, 1898–1969

Classification

Paintings

Credit Line

Harvard Art Museums/Fogg Museum, Richard Norton Memorial Fund, 2011.12

Copyright

© Estate of Ben Shahn / Artists Rights Society (ARS), New York

Machine Generated Data

Tags (confidence in percent)

Amazon
created on 2019-04-06

Person 93.6
Human 93.6
Person 92.7
Art 89.4
Person 88.3
Person 88.2
Person 83
Person 78.3
Person 78.2
Person 78
Crowd 73.1
Mural 70.6
Person 70.6
Face 68.3
Person 67.7
Painting 67.3
Head 66.8
Person 63.9
Advertisement 63.3
Collage 63.3
Poster 63.3
Person 43.3
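The tag lists above are flat "label confidence" pairs, with confidence on a 0–100 scale. A minimal sketch for turning such lines into structured records — the sample values are copied from the Amazon list above, while the parsing function itself is my own illustration, not part of any vendor API:

```python
def parse_tags(lines):
    """Split 'label confidence' lines into (label, float) pairs.

    A label may contain spaces (e.g. 'group together'), so the
    confidence is taken from the final whitespace-separated token.
    """
    tags = []
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank separator lines
        label, _, score = line.rpartition(" ")
        tags.append((label, float(score)))
    return tags

# Sample lines from the Amazon tag list above
amazon_tags = parse_tags([
    "Person 93.6",
    "Art 89.4",
    "Crowd 73.1",
    "group together 82.4",
])
top = max(amazon_tags, key=lambda t: t[1])  # highest-confidence tag
```

Splitting on the last space (rather than the first) is what makes multi-word labels like "group together" or "facial expression" survive intact.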

Clarifai
created on 2018-03-23

people 99.7
group 99.2
adult 99.2
portrait 98.7
man 98.6
woman 95.4
art 94.8
illustration 93.4
facial expression 93.1
veil 92.8
painting 92.2
leader 91.7
print 91.7
religion 91.5
wear 89.6
gang 86
business 85.1
many 84.9
offense 83.1
group together 82.4

Imagga
created on 2018-03-23

mosaic 33.1
graffito 32.2
art 27
decoration 26.2
jigsaw puzzle 24.6
comic book 24.4
religion 23.3
church 23.1
puzzle 19.4
money 18.7
vintage 17.3
currency 17
book jacket 16.4
covering 16.1
old 16
postmark 15.8
stamp 15.5
mail 15.3
transducer 15.1
portrait 14.9
masterpiece 14.8
letter 14.7
game 14.2
man 14.1
black 14
golden 13.7
jacket 13.7
museum 13.7
envelope 13.6
dollar 13
culture 12.8
face 12.8
postage 12.8
postal 12.7
bible 12.7
icon 12.7
painter 12.5
god 12.4
cash 11.9
finance 11.8
dollars 11.6
spirituality 11.5
colorful 11.5
painted 11.4
electrical device 11.4
male 11.3
close 10.8
financial 10.7
travel 10.5
bill 10.4
one 10.4
prayer rug 10.4
banking 10.1
people 10
global 10
shows 9.8
printed 9.8
orthodox 9.8
print media 9.8
wrapping 9.7
artist 9.6
saint 9.6
faith 9.6
post 9.5
antique 9.5
closeup 9.4
symbol 9.4
religious 9.4
mask 9.1
painting 9
bank 8.9
prophet 8.9
floor cover 8.7
holy 8.7
person 8.6
unique 8.5
business 8.5
adult 8.4
rug 8.3
retro 8.2
wealth 8.1
office 8
byzantine 7.9
philately 7.9
circa 7.9
known 7.9
stamps 7.9
device 7.8
paintings 7.8
ancient 7.8
banknote 7.8
wall 7.7
notes 7.7
card 7.6
exchange 7.6
fine 7.6
paint 7.2
detail 7.2
sexy 7.2

Google
created on 2018-03-23

art 90.2
mural 82.2
painting 73.2
artwork 61.9
visual arts 55.7

Microsoft
created on 2018-03-23

text 94.6

Face analysis

AWS Rekognition

Age 26-43
Gender Female, 77%
Confused 5.1%
Happy 7.3%
Disgusted 1.6%
Angry 3.5%
Surprised 9.1%
Sad 4.8%
Calm 68.5%

AWS Rekognition

Age 26-43
Gender Female, 50.6%
Sad 4.3%
Disgusted 0.3%
Confused 7.4%
Angry 2%
Happy 1.6%
Surprised 1.3%
Calm 83.2%

AWS Rekognition

Age 35-53
Gender Male, 71.6%
Angry 2.7%
Happy 68.9%
Surprised 5.3%
Sad 3.6%
Disgusted 1.8%
Calm 4.4%
Confused 13.2%

AWS Rekognition

Age 26-43
Gender Male, 84.8%
Disgusted 7.4%
Angry 10.9%
Happy 36.6%
Calm 14%
Surprised 13.5%
Sad 3.3%
Confused 14.4%

AWS Rekognition

Age 48-68
Gender Male, 99.3%
Angry 7.2%
Calm 63.5%
Confused 16.2%
Surprised 2.1%
Disgusted 3.1%
Happy 1.3%
Sad 6.6%

AWS Rekognition

Age 45-65
Gender Female, 85%
Sad 6%
Angry 3.7%
Happy 16.2%
Calm 58.1%
Disgusted 4.2%
Confused 6.4%
Surprised 5.5%

AWS Rekognition

Age 26-43
Gender Male, 89.4%
Confused 30.1%
Sad 11.8%
Calm 13.1%
Disgusted 15.3%
Angry 17.3%
Happy 2.9%
Surprised 9.5%

AWS Rekognition

Age 35-52
Gender Female, 83%
Happy 19.7%
Sad 40%
Angry 13.2%
Surprised 5.3%
Disgusted 4.9%
Calm 9.5%
Confused 7.3%

AWS Rekognition

Age 57-77
Gender Male, 99%
Angry 2.9%
Calm 59.2%
Confused 7.9%
Happy 19.2%
Disgusted 1%
Surprised 4.9%
Sad 4.8%

AWS Rekognition

Age 26-43
Gender Male, 97.9%
Disgusted 0.1%
Happy 0.8%
Surprised 0.4%
Sad 2.2%
Calm 94.2%
Angry 0.7%
Confused 1.5%

AWS Rekognition

Age 26-43
Gender Female, 61.6%
Disgusted 3%
Happy 1.4%
Calm 81.6%
Sad 3%
Surprised 5%
Angry 2%
Confused 4%

AWS Rekognition

Age 35-52
Gender Female, 81.2%
Angry 8.5%
Sad 7.2%
Confused 2.8%
Surprised 4.5%
Happy 2%
Disgusted 66.8%
Calm 8.2%

AWS Rekognition

Age 26-43
Gender Male, 76%
Angry 6.2%
Happy 9.7%
Calm 20.4%
Surprised 48.4%
Sad 2.3%
Disgusted 4.5%
Confused 8.5%

AWS Rekognition

Age 26-43
Gender Female, 74.5%
Sad 17.5%
Disgusted 23.1%
Happy 3.2%
Angry 23.1%
Confused 5.9%
Calm 23.6%
Surprised 3.6%
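Each AWS Rekognition face block above reports an estimated age range, a gender guess with confidence, and a distribution over seven emotions. A minimal sketch for picking the dominant emotion from one such record — the values are copied from the first face block above; the helper name is my own, not a Rekognition API call:

```python
def dominant_emotion(emotions):
    """Return the (name, percent) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

# Emotion distribution from the first AWS Rekognition block above
face = {
    "Confused": 5.1, "Happy": 7.3, "Disgusted": 1.6, "Angry": 3.5,
    "Surprised": 9.1, "Sad": 4.8, "Calm": 68.5,
}
name, score = dominant_emotion(face)  # ("Calm", 68.5)
```

Note that the percentages are per-emotion confidences, not a normalized distribution — they need not sum to exactly 100.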

Microsoft Cognitive Services

Age 70
Gender Male

Microsoft Cognitive Services

Age 79
Gender Male

Microsoft Cognitive Services

Age 56
Gender Male

Microsoft Cognitive Services

Age 36
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 60
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
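Unlike Rekognition, Google Vision reports face attributes on a five-step likelihood scale rather than numeric confidences. To compare or sort faces, those strings can be mapped to ordinals; a minimal sketch, with the scale ordered from least to most likely (the middle value "Likely" does not appear in the blocks above but belongs to the scale; the mapping and ranking helper are my own):

```python
# Ordinal encoding of the Google Vision likelihood scale
LIKELIHOOD = {
    "Very unlikely": 0,
    "Unlikely": 1,
    "Possible": 2,
    "Likely": 3,
    "Very likely": 4,
}

def joy_rank(face):
    """Ordinal rank of a face's Joy likelihood (higher = more likely)."""
    return LIKELIHOOD[face["Joy"]]

# Joy values from the first three Google Vision blocks above
faces = [
    {"Joy": "Unlikely"},
    {"Joy": "Very likely"},
    {"Joy": "Possible"},
]
happiest = max(faces, key=joy_rank)  # the "Very likely" face
```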

Feature analysis

Amazon

Person 93.6%
Painting 67.3%

Captions

Microsoft

a group of people posing for the camera 81.7%
a group of people posing for a picture 81.6%
a group of people posing for a photo 74.1%

Text analysis

Amazon

soosocbob