Human Generated Data

Title

Man at Window

Date

c. 1973

People

Artist: Jan Peter Tripp, German, born 1945

Classification

Drawings

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of the Artist through the German Art Dealers Association, 1992.229

Machine Generated Data

Tags

Amazon
created on 2020-04-30

Art 98.8
Drawing 98.8
Human 98.8
Sketch 96.3
Person 91.2
Text 76
Painting 68.9

Clarifai
created on 2020-04-30

people 99.6
portrait 99.3
one 98.7
adult 98.3
woman 97.8
monochrome 95.7
man 95
wear 93.7
indoors 92
girl 90.7
art 88.9
room 88.6
window 87.4
street 85.3
winter 83.6
two 82.8
outerwear 82
bill 80.7
facial expression 79.3
paper 79.1

Imagga
created on 2020-04-30

sketch 52.1
drawing 39.3
money 39.1
currency 35
representation 33.7
cash 32.9
dollar 31.6
bank 29.8
banking 28.5
bill 25.7
paper 24.9
finance 22.8
close 22.3
wealth 21.5
financial 19.6
business 18.8
statue 18.4
dollars 18.3
face 17.8
franklin 17.7
pay 17.3
hundred 16.5
one 16.4
closeup 16.2
portrait 15.5
banknote 15.5
us 15.4
sculpture 15.3
savings 14.9
bills 14.6
exchange 14.3
rich 14
old 13.9
payment 13.5
head 13.4
loan 13.4
famous 13
president 11.8
religion 11.6
finances 11.6
capital 11.4
monument 11.2
commerce 11.2
note 11
banknotes 10.8
god 10.5
art 10.3
economy 10.2
church 10.2
man 10.1
hound 10
economic 9.7
states 9.7
corbel 9.7
price 9.6
stone 9.6
symbol 9.4
sign 9
history 8.9
market 8.9
funds 8.8
architecture 8.7
ancient 8.6
united 8.6
culture 8.5
religious 8.4
historic 8.2
detail 8
wages 7.8
value 7.8
bracket 7.7
sales 7.7
sale 7.4
investment 7.3
success 7.2

Google
created on 2020-04-30

Microsoft
created on 2020-04-30

sketch 99.8
drawing 99.7
text 95.5
art 93.6
painting 91.5
black and white 90.6
human face 88.3
child art 81.9
gallery 79.5
person 67
room 57.2

Color Analysis

Face analysis

Amazon

Google

AWS Rekognition

Age 22-34
Gender Male, 70.7%
Sad 12.3%
Happy 0.2%
Angry 0.8%
Confused 0.2%
Fear 0.3%
Disgusted 0.1%
Calm 86%
Surprised 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 91.2%
Painting 68.9%

Categories

Imagga

pets animals 78.1%
paintings art 21.5%

Captions

Microsoft
created on 2020-04-30

an old photo of a person 70.2%
a photo of a person 67.3%
a close up of a person 63.7%

Text analysis

Amazon

/0192-2.29

Google

1992-229
1992-229