Human Generated Data

Title

Figures in a Landscape

Date

c. 1935

People

Artist: Karl Hofer, German, 1878–1955

Classification

Paintings

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of G. David Thompson in memory of Curt Valentin, BR56.231

Machine Generated Data

Tags

Amazon
created on 2020-04-24

Human 99.1
Person 99.1
Person 98.8
Person 97.7
Person 96.6
Art 94.9
Painting 93.5
Leisure Activities 93.2
Guitar 91.4
Musical Instrument 91.4
Person 90.9
Musician 77.2
Person 67.5
Performer 67.4
Guitarist 57
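
A minimal sketch of how a label list like the one above can be retrieved from AWS Rekognition's DetectLabels API via boto3. The record does not describe the museum's actual pipeline; the image file name and the MinConfidence threshold below are illustrative assumptions.

import boto3

# Sketch only: reproduce Rekognition-style "label confidence" pairs.
# The file name and thresholds are assumptions, not from this record.
client = boto3.client("rekognition")

with open("figures_in_a_landscape.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")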

Clarifai
created on 2020-04-24

people 100
group 99.9
adult 99.7
group together 99.3
many 98.3
man 97.7
music 96.9
several 96.7
woman 96.3
child 95.5
print 95.5
two 94.6
musician 94.1
five 93.3
guitar 92.9
art 92.4
four 90.9
illustration 90.8
stringed instrument 90.2
three 90
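
A comparable sketch for the Clarifai list above, assuming Clarifai's v2 REST prediction endpoint and its stock general model; the API key, model ID, and image URL are placeholders, not details from this record.

import requests

# Sketch only: Clarifai v2 prediction returns concepts with 0-1 "value"
# scores, which scale to the 0-100 figures shown above.
resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": "Key YOUR_CLARIFAI_API_KEY"},  # placeholder key
    json={"inputs": [{"data": {"image": {"url": "https://example.com/image.jpg"}}}]},
)
resp.raise_for_status()
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")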

Imagga
created on 2020-04-24

harp 81.6
stringed instrument 34.1
musical instrument 22.7
support 20.2
person 19.2
device 17.7
portrait 16.8
dress 16.3
adult 15.5
sexy 15.3
park 14.6
child 14.5
attractive 14
pretty 13.3
model 13.2
fashion 12.8
bridge 12.7
people 11.7
suspension bridge 11.6
sculpture 11.5
natural 11.4
happy 11.3
stone 11
traditional 10.8
face 10.7
body 10.4
black 10.2
smile 10
lady 9.7
culture 9.4
travel 9.1
city 9.1
mother 9.1
old 9.1
statue 8.7
architecture 8.6
animal 8.6
outdoors 8.5
elegance 8.4
structure 8.3
vacation 8.2
man 8.1
art 8
women 7.9
cute 7.9
wall 7.9
love 7.9
parent 7.6
dark 7.5
tourism 7.4
exotic 7.3
sensuality 7.3
gorgeous 7.2
lifestyle 7.2
male 7.2
tree 7.2
grass 7.1
summer 7.1
day 7.1
happiness 7
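
The Imagga list above matches the shape of Imagga's /v2/tags endpoint, sketched below under the assumption of HTTP Basic authentication with a key/secret pair; the credentials and image URL are placeholders.

import requests

# Sketch only: Imagga's tagging endpoint returns localized tag names with
# confidence scores on a 0-100 scale, as in the list above.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/image.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")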

Google
created on 2020-04-24

Microsoft
created on 2020-04-24

outdoor 99
person 98.2
text 95.2
drawing 95
painting 89.5
man 87.9
clothing 83.5
sketch 79
cartoon 76.1
posing 68
old 67.5
people 55.1
family 17.3
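
The Microsoft list above resembles output from Azure Computer Vision's Tag operation; the sketch below assumes the v3.0 REST endpoint (current in 2020), and the resource endpoint, key, and image URL are placeholders.

import requests

# Sketch only: Azure Computer Vision returns tags with 0-1 confidences,
# which scale to the 0-100 figures shown above.
endpoint = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
resp = requests.post(
    f"{endpoint}/vision/v3.0/tag",
    headers={"Ocp-Apim-Subscription-Key": "YOUR_KEY"},  # placeholder key
    json={"url": "https://example.com/image.jpg"},
)
resp.raise_for_status()
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")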

Color Analysis

Face analysis

AWS Rekognition

Age 14-26
Gender Female, 50.2%
Happy 45.1%
Disgusted 45%
Angry 45%
Calm 45.9%
Sad 53.9%
Fear 45%
Confused 45%
Surprised 45%

AWS Rekognition

Age 11-21
Gender Female, 54.9%
Calm 49.1%
Disgusted 45%
Confused 45.2%
Surprised 45%
Fear 45.1%
Happy 45%
Sad 49.6%
Angry 45.9%

AWS Rekognition

Age 13-25
Gender Female, 52.3%
Fear 45%
Angry 45.1%
Surprised 45%
Disgusted 45%
Calm 52.7%
Sad 47%
Confused 45%
Happy 45.1%

AWS Rekognition

Age 22-34
Gender Male, 52%
Angry 45%
Happy 45.1%
Calm 46.9%
Confused 45%
Fear 45.1%
Sad 52.9%
Surprised 45%
Disgusted 45%
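
The four AWS Rekognition face records above (age range, gender, and per-emotion confidences) match the shape of Rekognition's DetectFaces API when all attributes are requested. A minimal boto3 sketch follows; the image file name is an assumption.

import boto3

# Sketch only: DetectFaces with Attributes=["ALL"] returns one FaceDetails
# entry per detected face, with AgeRange, Gender, and Emotions fields.
client = boto3.client("rekognition")

with open("figures_in_a_landscape.jpg", "rb") as f:  # hypothetical local image
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")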

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
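
The four Google Vision blocks above report likelihood buckets rather than percentages; they correspond to the face_annotations fields of the Cloud Vision API. A sketch using the google-cloud-vision client follows; the image path is an assumption.

from google.cloud import vision

# Sketch only: each FaceAnnotation carries enum likelihoods (VERY_UNLIKELY
# through VERY_LIKELY) for the attributes listed above.
client = vision.ImageAnnotatorClient()

with open("figures_in_a_landscape.jpg", "rb") as f:  # hypothetical local image
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        print(name, vision.Likelihood(value).name.replace("_", " ").capitalize())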

Feature analysis

Amazon

Person 99.1%
Painting 93.5%
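
The two feature-analysis entries plausibly correspond to Rekognition labels that include bounding-box Instances, i.e. objects localized in the image; that reading is an assumption, not stated in the record. Reusing the detect_labels response from the sketch after the Amazon tag list:

# Sketch only: keep labels whose Instances list is non-empty.
for label in response["Labels"]:
    if label["Instances"]:
        print(f"{label['Name']} {label['Confidence']:.1f}%")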

Categories

Imagga

people portraits 65.4%
paintings art 33.5%
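
The Imagga category pair above matches the shape of Imagga's /v2/categories endpoint; the "personal_photos" categorizer name, the credentials, and the image URL below are assumptions.

import requests

# Sketch only: the categorization endpoint returns named categories with
# 0-100 confidence scores, as in the pair above.
resp = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": "https://example.com/image.jpg"},  # placeholder URL
    auth=("YOUR_API_KEY", "YOUR_API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()
for cat in resp.json()["result"]["categories"]:
    print(f"{cat['name']['en']} {cat['confidence']:.1f}%")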