Human Generated Data

Title

Untitled (woman at a spinning wheel)

Date

c. 1870-c. 1890

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.367.34

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 95.8
Person 95.8
Person 92.6
Poster 90
Collage 90
Advertisement 90
Person 87.4
Art 83.7
Person 81.1
Painting 80.1
Person 78.6
Apparel 75.5
Clothing 75.5
Person 66.1
Text 62.3
Person 62.3
Vehicle 57.4
Transportation 57.4
Figurine 57.3
Machine 56.3
Spoke 56.3

Clarifai
created on 2018-03-16

illustration 99.8
people 99.7
print 99.6
painting 99.5
art 99.4
group 99.2
adult 98.4
woman 98.3
lithograph 98.1
furniture 97.3
man 95.1
many 94.8
no person 94.2
room 94
collection 93.8
veil 93.2
gown (clothing) 92.7
wear 92.1
seat 91.5
indoors 90.8

Imagga
created on 2018-03-16

case 91.7
shelf 25.9
vintage 24
old 21.6
paper 18.8
antique 17.3
retro 15.5
money 14.4
art 14.4
currency 14.3
stamp 14
postage 13.7
letter 13.7
collection 13.5
mail 13.4
business 13.3
post 13.3
cash 12.8
bank 12.5
frame 12.5
design 12.4
bill 12.4
finance 11.8
postal 10.8
religion 10.7
texture 10.4
ancient 10.4
culture 10.2
envelope 10.2
colorful 10
flower 10
postmark 9.8
decoration 9.6
pattern 9.6
exchange 9.5
church 9.2
banking 9.2
note 9.2
office 8.8
payment 8.6
temple 8.5
unique 8.5
travel 8.4
aged 8.1
wealth 8.1
history 8
financial 8
close 8
home 8
box 7.9
banknote 7.8
stained 7.7
bookmark 7.5
china 7.5
savings 7.4
symbol 7.4
global 7.3
painting 7.2
black 7.2

Google
created on 2018-03-16

picture frame 77.6
collection 71.9
art 70.7

Microsoft
created on 2018-03-16

gallery 96.9
room 96
scene 87.4
different 84.4
various 54.4
bunch 54
several 20.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Male, 52.7%
Sad 46.7%
Surprised 45.1%
Confused 45.2%
Happy 45.1%
Calm 52.7%
Disgusted 45%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Female, 54.9%
Surprised 45.7%
Confused 45.5%
Calm 45.5%
Sad 52.4%
Happy 45.2%
Disgusted 45.2%
Angry 45.5%

AWS Rekognition

Age 20-38
Gender Female, 54.6%
Angry 45.8%
Disgusted 45.4%
Calm 48.1%
Happy 45.8%
Sad 48.5%
Surprised 46.1%
Confused 45.3%

AWS Rekognition

Age 27-44
Gender Female, 54.2%
Sad 45.3%
Surprised 45.4%
Calm 49.4%
Happy 45.9%
Angry 45.3%
Disgusted 48.6%
Confused 45.1%

AWS Rekognition

Age 4-9
Gender Female, 50.5%
Confused 49.5%
Surprised 49.5%
Happy 49.6%
Angry 50%
Disgusted 49.5%
Sad 49.8%
Calm 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.1%
Sad 49.6%
Disgusted 49.8%
Confused 49.7%
Surprised 49.6%
Calm 49.8%
Angry 49.6%
Happy 49.5%

AWS Rekognition

Age 23-38
Gender Female, 50.4%
Angry 49.9%
Disgusted 49.6%
Calm 49.6%
Happy 49.5%
Sad 49.7%
Surprised 49.6%
Confused 49.6%

Feature analysis

Amazon

Person 95.8%
Painting 80.1%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

at3T