Human Generated Data

Title

Untitled (group of four women, one with string instrument)

Date

c. 1870-c. 1890

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Purchase through the generosity of Melvin R. Seiden, P1982.367.44

Machine Generated Data

Tags

Amazon
created on 2019-04-07

Human 98.4
Person 98.4
Person 98.3
Person 87.2
Clothing 86.6
Apparel 86.6
Person 82.3
Art 78.6
Advertisement 77.2
Person 75.6
Person 70.8
Painting 69
Poster 68.2
Person 67
Archaeology 66.5
Text 64.3
Collage 61.1

Clarifai
created on 2018-03-16

print 99.5
illustration 99.4
people 99
no person 98.4
group 98.1
lithograph 97.9
art 97.4
collection 96.6
many 96.3
adult 95.3
furniture 94.8
painting 94.3
room 93
interior design 92.7
woman 92.3
man 91.9
indoors 88.8
container 88
shelf 87.3
one 86.1

Imagga
created on 2018-03-16

insulating material 48.4
building material 36.3
case 30.1
envelope 28.6
old 25.8
paper 25.1
vintage 24
retro 19.7
money 19.6
currency 18.8
finance 17.7
container 16.8
bank 16.1
cash 14.6
collection 14.4
mail 14.4
stamp 14.1
design 14.1
antique 14
postage 13.8
bill 13.3
texture 13.2
book 13.1
flower 13.1
letter 12.8
art 12.6
frame 12.5
business 12.1
ancient 12.1
banking 11.9
grunge 11.9
financial 11.6
comic book 11.3
pattern 10.9
aged 10.9
postmark 10.8
wealth 10.8
banknote 10.7
exchange 10.5
blank 10.3
product 9.4
floral 9.4
card 9.3
bookmark 9.3
page 9.3
note 9.2
border 9
dirty 9
office 8.8
symbol 8.8
post 8.6
unique 8.5
artwork 8.2
postal 7.8
bills 7.8
drawing 7.7
payment 7.7
stained 7.7
pay 7.7
savings 7.5
dollar 7.4
investment 7.3
decoration 7.3
color 7.2

Google
created on 2018-03-16

Microsoft
created on 2018-03-16

different 96.5
various 69.9
bunch 40.1
several 12.8

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 50.1%
Disgusted 49.9%
Happy 49.5%
Angry 49.7%
Surprised 49.5%
Sad 49.7%
Calm 49.6%
Confused 49.6%

AWS Rekognition

Age 4-9
Gender Female, 50.5%
Sad 50.3%
Surprised 49.5%
Disgusted 49.6%
Confused 49.5%
Calm 49.5%
Happy 49.5%
Angry 49.6%

AWS Rekognition

Age 16-27
Gender Female, 50.3%
Happy 49.5%
Disgusted 49.5%
Angry 49.6%
Surprised 49.5%
Sad 49.8%
Calm 50%
Confused 49.5%

AWS Rekognition

Age 29-45
Gender Female, 50.5%
Disgusted 49.5%
Calm 49.5%
Confused 49.5%
Surprised 49.5%
Sad 49.7%
Happy 50.1%
Angry 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Calm 49.5%
Angry 49.6%
Happy 49.6%
Disgusted 49.5%
Sad 50.3%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 12-22
Gender Female, 54.3%
Disgusted 45.6%
Surprised 46.2%
Sad 47.2%
Calm 47.7%
Angry 46.3%
Happy 45.9%
Confused 46.2%

AWS Rekognition

Age 27-44
Gender Female, 50.1%
Disgusted 49.5%
Calm 49.7%
Confused 49.5%
Surprised 49.6%
Sad 50.1%
Happy 49.5%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Sad 49.5%
Calm 49.6%
Surprised 49.5%
Angry 49.5%
Disgusted 50.4%
Happy 49.5%
Confused 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Calm 49.5%
Disgusted 49.6%
Angry 49.5%
Happy 49.7%
Confused 49.5%
Sad 50.1%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.2%
Disgusted 49.7%
Sad 49.6%
Calm 49.8%
Surprised 49.6%
Happy 49.6%
Confused 49.5%
Angry 49.6%

AWS Rekognition

Age 15-25
Gender Female, 50.5%
Surprised 49.5%
Disgusted 49.8%
Calm 49.5%
Confused 49.6%
Angry 49.5%
Happy 49.6%
Sad 50%

Feature analysis

Amazon

Person 98.4%
Painting 69%

Categories

Imagga

paintings art 100%

Captions

Text analysis

Amazon

P3AlE
7AhNE
sat