Human Generated Data

Title

Ingres and the Lovers

Date

1982

People

Artist: Stephen Curtis, American, 1946-1996

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Andrew Wyeth Fund for the Purchase of American Drawings, M20484

Machine Generated Data

Tags

Amazon
created on 2019-11-02

Human 99.6
Person 99.6
Art 98.6
Person 96.8
Painting 96.6
Person 94
Person 89
Person 74.4
Mural 58.3
Person 55.2
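
The labels above pair a class name with a confidence score (a percentage). Output of this shape can be produced with Amazon Rekognition's DetectLabels operation; the boto3 sketch below is illustrative only, and the file name, region, and confidence threshold are assumptions rather than details of the museum's actual pipeline.

    import boto3

    # Illustrative client; region and image file name are assumptions
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        response = rekognition.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=50,  # assumed cutoff; the listing above stops near 55%
        )

    for label in response["Labels"]:
        # Prints e.g. "Person 99.6"
        print(f"{label['Name']} {label['Confidence']:.1f}")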

Clarifai
created on 2019-11-02

people 99.7
nude 99.5
art 99.4
adult 99.3
woman 99
man 99
group 98.6
illustration 97.4
portrait 96
print 95.7
painting 95.1
two 94.8
monochrome 88.6
religion 88.3
Renaissance 87.9
one 86.9
child 86.7
old 86
three 85.6
vintage 85.3
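
Clarifai reports concepts with a 0-1 value, shown above scaled to percentages. A predict call of roughly this shape against Clarifai's v2 REST API would return such concepts; the API key, model identifier, and file name here are placeholders, and the exact endpoint and auth scheme should be checked against Clarifai's current documentation.

    import base64
    import requests

    API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder credential
    MODEL_ID = "general-image-recognition"  # assumed id of the general model

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    resp = requests.post(
        f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
        headers={"Authorization": f"Key {API_KEY}"},
        json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
    )
    resp.raise_for_status()

    for concept in resp.json()["outputs"][0]["data"]["concepts"]:
        # value is 0-1; multiply by 100 to match the listing above
        print(f"{concept['name']} {concept['value'] * 100:.1f}")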

Imagga
created on 2019-11-02

sand 48.1
soil 43
earth 32.1
statue 27.6
sculpture 26.3
money 24.7
cash 23.8
religion 23.3
dollar 23.2
ancient 22.5
currency 22.4
art 21.7
bill 19
culture 18.8
temple 18.4
god 17.2
banking 16.6
stone 16.5
architecture 16.4
bank 16.4
finance 16.1
dollars 15.5
old 15.3
paper 14.9
close 14.8
us 14.5
history 14.3
financial 14.3
travel 14.1
religious 14.1
monument 14
spirituality 13.4
carving 13.4
famous 13
business 12.8
wealth 12.6
holy 12.5
figure 12.1
jigsaw puzzle 11.9
tourism 11.6
face 11.4
one 11.2
man 10.8
hundred 10.7
pay 10.6
puzzle 10.4
savings 10.3
decoration 10
franklin 9.8
portrait 9.7
banknote 9.7
finances 9.6
comic book 9.5
symbol 9.4
museum 9.3
historic 9.2
city 9.1
banknotes 8.8
bills 8.7
payment 8.7
loan 8.6
exchange 8.6
memorial 8.5
male 8.5
mosaic 8.4
rich 8.4
church 8.3
game 8.2
success 8.1
creation 8
marble 7.9
bust 7.9
carved 7.8
states 7.7
notes 7.7
faith 7.7
united 7.6
capital 7.6
vintage 7.4
structure 7.4
economy 7.4
closeup 7.4
newspaper 7.3
sketch 7.3
people 7.3
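
The Imagga tags follow the same label-plus-confidence pattern and correspond to its image-tagging endpoint. A hedged sketch against the documented /v2/tags call follows; the credentials and image URL are placeholders.

    import requests

    API_KEY = "YOUR_IMAGGA_KEY"        # placeholder credentials
    API_SECRET = "YOUR_IMAGGA_SECRET"
    IMAGE_URL = "https://example.org/ingres_and_the_lovers.jpg"  # placeholder

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),  # Imagga uses HTTP basic auth with key/secret
    )
    resp.raise_for_status()

    for item in resp.json()["result"]["tags"]:
        # Each entry pairs an English tag with a 0-100 confidence score
        print(f"{item['tag']['en']} {item['confidence']:.1f}")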

Google
created on 2019-11-02

Microsoft
created on 2019-11-02

text 98.9
book 95.5
painting 91
person 88.6
drawing 83.9
human face 75
woman 70.5
old 69.4
gallery 67.4
posing 50.9
different 38.6
vintage 33.5
several 16.5
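
The Microsoft tags are the kind of result returned by Azure Computer Vision's image-tagging endpoint. The REST sketch below targets the v3.2 API; the resource endpoint, key, and file name are assumptions.

    import requests

    ENDPOINT = "https://YOUR-RESOURCE.cognitiveservices.azure.com"  # placeholder
    KEY = "YOUR_AZURE_KEY"                                          # placeholder

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        resp = requests.post(
            f"{ENDPOINT}/vision/v3.2/tag",
            headers={
                "Ocp-Apim-Subscription-Key": KEY,
                "Content-Type": "application/octet-stream",
            },
            data=f.read(),
        )
    resp.raise_for_status()

    for tag in resp.json()["tags"]:
        # Confidence is 0-1; the listing above shows it as a percentage
        print(f"{tag['name']} {tag['confidence'] * 100:.1f}")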

Color analysis

Face analysis

AWS Rekognition

Age 18-30
Gender Female, 54.7%
Sad 45.1%
Angry 45%
Calm 54.5%
Confused 45.1%
Fear 45.2%
Surprised 45.1%
Disgusted 45%
Happy 45%

AWS Rekognition

Age 27-43
Gender Female, 52.6%
Confused 45.2%
Fear 45.2%
Surprised 45.7%
Happy 45.7%
Calm 52.4%
Angry 45.2%
Sad 45.3%
Disgusted 45.1%

AWS Rekognition

Age 23-37
Gender Female, 54.6%
Confused 45%
Happy 45%
Sad 52.8%
Calm 45%
Angry 45%
Surprised 45%
Disgusted 45%
Fear 47.2%

AWS Rekognition

Age 13-23
Gender Female, 54.1%
Calm 53.6%
Sad 46%
Fear 45%
Disgusted 45%
Angry 45.3%
Surprised 45%
Confused 45%
Happy 45%

AWS Rekognition

Age 23-35
Gender Male, 74.6%
Calm 4.6%
Happy 0.1%
Sad 92.4%
Fear 1.7%
Surprised 0.3%
Confused 0.3%
Disgusted 0.1%
Angry 0.5%
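
Each AWS Rekognition block above (an age range, a gender estimate, and a confidence per emotion) corresponds to one entry in the FaceDetails list that DetectFaces returns when all attributes are requested. A minimal boto3 sketch, with an illustrative file name:

    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        response = rekognition.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # request age range, gender, emotions, etc.
        )

    for face in response["FaceDetails"]:
        age, gender = face["AgeRange"], face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            # Emotion types come back uppercase, e.g. "SAD"
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")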

Microsoft Cognitive Services

Age 47
Gender Male

Microsoft Cognitive Services

Age 34
Gender Male

Microsoft Cognitive Services

Age 20
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
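
The Google Vision entries report a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) per attribute for each detected face. A short sketch using the google-cloud-vision client library; the file name is a placeholder.

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    for face in response.face_annotations:
        # Each attribute is a Likelihood enum value, e.g. VERY_UNLIKELY
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)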

Feature analysis

Amazon

Person 99.6%
Painting 96.6%

Categories

Imagga

paintings art 97.8%

Text analysis

Google

Step t
Step
t
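
The fragmentary text above ("Step t") is the sort of result Google Vision's text detection returns when only a few characters in the image are legible. A minimal sketch, again with a placeholder file name:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()

    with open("ingres_and_the_lovers.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.text_detection(image=image)

    if response.text_annotations:
        # The first annotation holds the full detected text block
        print(response.text_annotations[0].description)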