Human Generated Data

Title

Untitled (meal scene with three figures)

Date

c. 1870-c. 1890

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.367.17

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Person 98.4
Human 98.4
Person 98.4
Person 97.2
Art 93.8
Clothing 92.5
Apparel 92.5
Painting 91.8

Clarifai
created on 2018-03-16

art 99.3
illustration 99.2
people 99.1
painting 99
print 98.8
group 97
card 95.9
adult 95.8
exhibition 95.7
picture frame 95.2
antique 93.5
museum 93.4
vintage 92.4
lithograph 92.4
wear 92.4
man 92.2
old 92.2
one 92.1
furniture 92
woman 91.5

Imagga
created on 2018-03-16

tray 100
receptacle 98.3
container 81.5
old 39
vintage 38
frame 30.8
retro 27.9
chalkboard 27.5
grunge 26.4
texture 25
blackboard 23.9
chalk 22.4
blank 22.3
board 21.3
wall 19.7
black 19.2
antique 17.3
paper 17.2
art 17
business 17
design 16.9
education 16.5
school 16.2
pattern 15
rusty 14.3
ancient 13.8
message 13.7
aged 13.6
finance 13.5
symbol 13.5
learn 13.2
wooden 13.2
envelope 12.9
note 11.9
money 11.9
ornate 11.9
border 11.8
communication 11.8
classroom 11.7
empty 11.2
letter 11
decoration 11
concrete 10.5
icon 10.3
study 10.3
nobody 10.1
wood 10
dirty 9.9
gold 9.9
space 9.3
dollar 9.3
currency 9
backgrounds 8.9
idea 8.9
style 8.9
postage 8.8
object 8.8
notice 8.7
announcement 8.7
stamp 8.7
invitation 8.7
mail 8.6
floral 8.5
card 8.5
word 8.5
wallpaper 8.4
sign 8.3
close 8
text 7.9
binding 7.8
drawing 7.7
us 7.7
post 7.6
head 7.6
horizontal 7.5
write 7.5
page 7.4
greeting 7.4
cash 7.3
global 7.3
rough 7.3
metal 7.2

Google
created on 2018-03-16

picture frame 84.1
art 58.9
painting 51.3

Microsoft
created on 2018-03-16

gallery 52.3
room 43.4
picture frame 22.6

Face analysis

AWS Rekognition

Age 20-38
Gender Female, 53.7%
Angry 46.2%
Sad 52.5%
Calm 45.6%
Disgusted 45.2%
Happy 45.2%
Surprised 45.1%
Confused 45.3%

AWS Rekognition

Age 17-27
Gender Female, 54.7%
Disgusted 45%
Angry 45.5%
Calm 45.2%
Sad 53.9%
Happy 45.1%
Confused 45.1%
Surprised 45.1%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Sad 45.6%
Surprised 46.4%
Calm 49.9%
Happy 46.4%
Angry 45.4%
Disgusted 46.1%
Confused 45.2%

Microsoft Cognitive Services

Age 34
Gender Female

Microsoft Cognitive Services

Age 16
Gender Female

Feature analysis

Amazon

Person 98.4%
Painting 91.8%

Categories

Imagga

paintings art 99.9%

Captions

Microsoft
created on 2018-03-16

a person sitting in a box 28.8%
a person sitting in a box 28.7%
a close up of a box 28.6%

Text analysis

Amazon

haif