Human Generated Data

Title

Interior of Busch-Reisinger Romanesque Hall

Date

1970

People

Artist: Barbara Westman, American

Classification

Drawings

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Purchase in memory of Eda K. Loeb, BR70.22

Machine Generated Data

Tags

Amazon
created on 2020-05-02

Human 98.2
Art 98.2
Drawing 98.2
Doodle 93.3
Person 85.9
Sketch 84
Painting 83.5
Person 60.5
Person 49.9
Person 43.2

Clarifai
created on 2020-05-02

illustration 99.4
architecture 99.2
group 99
building 98.7
people 98.4
house 98.2
print 97.9
city 97.2
art 97.2
chalk out 96.7
town 96.6
urban 93.6
man 93
travel 92.9
many 92.8
room 91.6
no person 91.3
street 91.2
home 91.2
vehicle 90.8

Imagga
created on 2020-05-02

drawing 100
representation 100
sketch 100
map 83.5
atlas 35.3
plan 34.1
geography 32.8
world 29.3
paper 28.3
globe 26.9
travel 26.8
antique 26.4
country 24.6
old 24.4
vintage 23.2
direction 22.9
city 22.5
navigation 22.2
route 21.5
nation 20.9
road 20.8
planet 20.8
continent 20.4
design 19.7
location 19.6
sepia 19.5
capital 19
guide 18.6
tourism 18.2
journey 17.9
boundary 17.8
navigate 17.7
tour 17.4
states 17.4
state 17.3
discovery 16.6
wallpaper 16.1
geographic 15.8
expedition 15.8
find 15.7
position 15.7
explore 14.7
gold 14
business 13.4
earth 12.8
global 12.8
architecture 12.5
symbol 12.1
art 11.7
dutch 11.7
retro 11.5
page 11.2
texture 11.1
grunge 11.1
graphic 11
pattern 11
blueprint 10.8
pencil 10.5
construction 10.3
cartography 9.8
architect 9.7
black 9.6
house 9.2
cartoon 8.9
print 8.5
china 8.5
sign 8.3
element 8.3
painting 8.1
history 8.1
topography 7.9
drafting 7.9
draft 7.9
ancient 7.8
line 7.7
project 7.7
north 7.7
path 7.6
floor 7.5
border 7.3
building 7.2

Google
created on 2020-05-02

Microsoft
created on 2020-05-02

text 99.9
sketch 99.7
drawing 99.6
map 99.1
art 90.9
illustration 89.4
cartoon 86.2

Face analysis

Amazon

AWS Rekognition

Age 20-32
Gender Male, 50.5%
Confused 50%
Surprised 49.9%
Happy 49.5%
Fear 49.5%
Sad 49.5%
Disgusted 49.5%
Calm 49.6%
Angry 49.5%

AWS Rekognition

Age 7-17
Gender Female, 50.2%
Angry 49.5%
Disgusted 49.5%
Happy 49.5%
Calm 49.5%
Confused 49.5%
Sad 50.5%
Fear 49.5%
Surprised 49.5%

AWS Rekognition

Age 10-20
Gender Female, 50.1%
Confused 49.5%
Happy 49.5%
Fear 50.4%
Surprised 49.5%
Angry 49.5%
Sad 49.5%
Disgusted 49.5%
Calm 49.5%

AWS Rekognition

Age 33-49
Gender Male, 50.4%
Calm 49.6%
Sad 49.5%
Fear 49.5%
Angry 50.3%
Surprised 49.5%
Happy 49.5%
Confused 49.5%
Disgusted 49.6%

AWS Rekognition

Age 19-31
Gender Female, 54%
Angry 47.5%
Confused 45.1%
Fear 45%
Calm 51.6%
Disgusted 45%
Sad 45.2%
Surprised 45%
Happy 45.5%

AWS Rekognition

Age 19-31
Gender Male, 50.2%
Sad 49.5%
Calm 50.5%
Fear 49.5%
Happy 49.5%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%

AWS Rekognition

Age 36-54
Gender Male, 50.5%
Happy 49.5%
Sad 49.5%
Disgusted 49.6%
Calm 50.3%
Surprised 49.5%
Confused 49.5%
Angry 49.5%
Fear 49.5%

AWS Rekognition

Age 22-34
Gender Female, 51.8%
Confused 45.1%
Sad 48.2%
Disgusted 45%
Fear 46.5%
Surprised 45%
Happy 45.3%
Angry 46.4%
Calm 48.4%

AWS Rekognition

Age 23-35
Gender Male, 50.5%
Confused 49.5%
Calm 49.8%
Sad 49.5%
Disgusted 49.5%
Surprised 49.5%
Fear 49.6%
Angry 49.6%
Happy 49.9%

AWS Rekognition

Age 12-22
Gender Male, 50.3%
Surprised 49.5%
Sad 49.5%
Fear 50.4%
Disgusted 49.5%
Calm 49.5%
Angry 49.5%
Confused 49.5%
Happy 49.5%

AWS Rekognition

Age 3-11
Gender Male, 50.2%
Fear 49.5%
Happy 49.5%
Sad 49.8%
Angry 49.8%
Confused 49.7%
Disgusted 49.6%
Surprised 49.5%
Calm 49.6%

AWS Rekognition

Age 34-50
Gender Female, 50.4%
Calm 50%
Confused 49.5%
Angry 49.9%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Fear 49.5%

Feature analysis

Amazon

Person 85.9%
Painting 83.5%

Captions

Microsoft

a close up of a map 89.1%
a map with text 84.4%
close up of a map 84.3%

Text analysis

Google

brlae ikstm
brlae
ikstm