Human Generated Data

Title

Untitled (Friesland, Marken; verso: Zeeland, Zeeland)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Janet and Daniel Tassel, 2007.219.5.2

Machine Generated Data

Tags

Amazon
created on 2019-11-05

Human 98.7
Person 98.7
Person 97.2
Food 96.6
Confectionery 96.6
Sweets 96.6
Person 95.1
Person 93.4
Person 92.8
Cream 91.7
Icing 91.7
Dessert 91.7
Cake 91.7
Creme 91.7
Person 90.1
Painting 69.4
Art 69.4
Apparel 69.2
Clothing 69.2
Outdoors 66.7
Nature 57.8
Chocolate 55.6
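
The label/confidence pairs above are the form of output returned by Amazon Rekognition's label-detection call. A minimal sketch of such a call, assuming configured AWS credentials and a hypothetical local copy of the image named photo.jpg (the museum's actual tagging pipeline is not documented here):

# Minimal sketch: label/confidence pairs from Amazon Rekognition.
# Assumes AWS credentials are configured; "photo.jpg" is a hypothetical file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,        # cap the number of labels returned
    MinConfidence=55.0,  # drop labels below roughly 55% confidence
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')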

Clarifai
created on 2019-11-05

box 96.8
cardboard 92.5
child 92.5
no person 92.4
paper 91.3
love 91.3
woman 91.2
art 90.9
two 89.2
retro 89
container 87.3
people 86.7
travel 86.3
wear 85.9
illustration 83.6
summer 83.2
painting 82.7
card 82.6
shopping 82
post 81.7

Imagga
created on 2019-11-05

envelope 75.8
container 57.2
paper 43.1
box 27
package 25.2
gift 23.2
old 23
stamp 21.5
vintage 20.7
decoration 20.7
blank 20.6
bookmark 20.6
holiday 18.6
ribbon 17.6
present 17.3
page 16.7
antique 16.4
card 16.3
grunge 16.2
celebration 15.2
birthday 14.8
retro 14.8
note 14.7
shopping 14.7
bow 14.2
object 13.9
design 13.3
gold 13.2
ancient 13
packet 12.9
money 12.8
yellow 12.6
day 12.6
season 12.5
carton 12.4
greeting 12.1
empty 12
die 11.2
texture 11.1
letter 11
valentine 10.9
dirty 10.8
packaging 10.7
gifts 10.7
book 10.6
wrapping 10.6
surprise 10.4
brown 10.3
aged 10
currency 9.9
copy 9.7
business 9.7
book jacket 9.7
shiny 9.5
ornament 9.5
symbol 9.4
frame 9.2
art 9.1
pillow 9
queen 8.8
mail 8.6
golden 8.6
post 8.6
close 8.6
shaping tool 8.5
anniversary 8.5
sign 8.3
single 8.2
backgrounds 8.1
bank 8.1
board 7.9
knot 7.8
king 7.8
torn 7.7
flower 7.7
floral 7.7
finance 7.6
jacket 7.5
pattern 7.5
notebook 7.4
ornate 7.3
collection 7.2

Google
created on 2019-11-05

Microsoft
created on 2019-11-05

painting 92.9
clothing 89.7
person 83.3
text 83.2
cartoon 74.6
drawing 63.6
flower 54.1
different 41.3
several 13.7

Color Analysis

Face analysis

AWS Rekognition

Age 14-26
Gender Male, 54.6%
Sad 45%
Calm 55%
Surprised 45%
Angry 45%
Fear 45%
Happy 45%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 25-39
Gender Male, 54.1%
Disgusted 45%
Happy 45%
Confused 45%
Sad 45%
Calm 54.6%
Angry 45.3%
Surprised 45%
Fear 45%

AWS Rekognition

Age 47-65
Gender Male, 53.4%
Angry 45%
Happy 45%
Surprised 45%
Sad 45%
Calm 55%
Confused 45%
Fear 45%
Disgusted 45%

AWS Rekognition

Age 51-69
Gender Female, 50.1%
Fear 45.1%
Calm 50.1%
Sad 47.8%
Confused 45.7%
Angry 45.5%
Happy 45.5%
Surprised 45.2%
Disgusted 45.1%

AWS Rekognition

Age 40-58
Gender Female, 53.7%
Fear 45%
Happy 45%
Sad 49.5%
Disgusted 45%
Calm 48.8%
Surprised 45%
Confused 45.3%
Angry 46.3%

AWS Rekognition

Age 3-11
Gender Female, 54.5%
Happy 45%
Disgusted 45.1%
Fear 45%
Calm 51.3%
Angry 48%
Sad 45.4%
Confused 45.2%
Surprised 45%
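
Each AWS Rekognition block above (age range, gender, and per-emotion confidences) corresponds to one entry in the FaceDetails list returned by Rekognition face detection. A minimal sketch under the same assumptions (configured credentials, hypothetical photo.jpg):

# Minimal sketch: per-face age range, gender, and emotion scores
# from Amazon Rekognition. "photo.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')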

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
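
The Google Vision results above are reported as likelihood labels (Very unlikely through Very likely) rather than percentages; they come from the face-detection annotations of the Vision API. A minimal sketch, assuming the google-cloud-vision client library and the same hypothetical photo.jpg:

# Minimal sketch: face-attribute likelihoods from the Google Cloud Vision API.
# Assumes Google Cloud credentials are configured; "photo.jpg" is hypothetical.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    content = f.read()

response = client.face_detection(image=vision.Image(content=content))

# Vision reports likelihoods as enum values, not percentages.
likelihood_names = ("UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
                    "POSSIBLE", "LIKELY", "VERY_LIKELY")

for face in response.face_annotations:
    print("Joy", likelihood_names[face.joy_likelihood])
    print("Sorrow", likelihood_names[face.sorrow_likelihood])
    print("Anger", likelihood_names[face.anger_likelihood])
    print("Surprise", likelihood_names[face.surprise_likelihood])
    print("Headwear", likelihood_names[face.headwear_likelihood])
    print("Blurred", likelihood_names[face.blurred_likelihood])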

Feature analysis

Amazon

Person 98.7%
Painting 69.4%

Categories

Imagga

paintings art 99%

Captions

Microsoft
created on 2019-11-05

a close up of a box 55.6%
close up of a box 49.5%
a close up of a box on a table 33.6%

Text analysis

Amazon

ha/teu
esleol ha/teu
esleol
592
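
The strings above are raw OCR detections of the handwritten notations; Rekognition text detection returns each as a TextDetection with the detected string, a type (LINE or WORD), and a confidence. A minimal sketch under the same assumptions:

# Minimal sketch: raw OCR strings from Amazon Rekognition text detection.
# "photo.jpg" is a hypothetical local file name.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the per-word duplicates
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}')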

Google

592 Yuesleaal
592
Yuesleaal