Human Generated Data

Title

Queen

Date

1975, printed 1984

People

Artist: Audrey Flack, American, born 1931

Printer: Guy Stricherz

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Sidney and Shirley Singer, 2013.177.1

Machine Generated Data

Tags

Amazon
created on 2019-04-08

Plant 98.2
Food 89
Egg 89
Wristwatch 85.6
Human 79.2
Blossom 74.6
Flower 74.6
Fruit 73.7
Person 71.5
Person 70.9
Meal 70
Burger 63.3
Flower Arrangement 62.6
Citrus Fruit 59.9
Flower Bouquet 59.6
Drink 56.9
Beverage 56.9
Bottle 55.4
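Each row above pairs a detected label with a confidence score. Assuming a response shaped like Amazon Rekognition's DetectLabels output (the `Labels`/`Name`/`Confidence` field names follow that API; the sample below is a hypothetical fragment, not the actual response for this image), the rows can be reproduced with a short sketch:

```python
# Sketch: flatten a Rekognition-style DetectLabels response into
# "Label confidence" rows, sorted by descending confidence.
# The sample response is hypothetical, not the real API output for this image.

sample_response = {
    "Labels": [
        {"Name": "Egg", "Confidence": 89.0},
        {"Name": "Plant", "Confidence": 98.2},
        {"Name": "Wristwatch", "Confidence": 85.6},
    ]
}

def format_labels(response):
    """Return one 'Name confidence' line per label, highest confidence first."""
    labels = sorted(response["Labels"],
                    key=lambda l: l["Confidence"], reverse=True)
    # :g drops a trailing .0, matching the record's "Egg 89" style
    return [f"{l['Name']} {l['Confidence']:g}" for l in labels]

for row in format_labels(sample_response):
    print(row)
```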

Clarifai
created on 2018-02-09

drink 97.6
no person 95.6
alcohol 95.2
wine 94.9
glass 94.5
food 94.3
fruit 90.2
desktop 90.2
tea 89.8
chocolate 88.9
table 88.1
celebration 87.8
cup 87.6
Christmas 87.5
sweet 87.2
party 86.7
decoration 86
bar 85.5
juice 85.2
delicious 83.2

Imagga
created on 2018-02-09

perfume 96.4
toiletry 80.3
fruit 40.3
food 28.5
citrus 24.5
fresh 24.2
orange 23.4
fruits 19.8
diet 19.6
healthy 19.5
breakfast 18.1
vitamin 17.8
apple 17.1
nutrition 16.8
celebration 16.7
juice 16.6
gold 16.4
health 16
glass 15.6
meal 15.5
tasty 15
sweet 15
yellow 14.6
decoration 13.7
drink 13.4
freshness 13.3
juicy 12.7
natural 12.7
box 12.6
holiday 12.2
organic 11.8
traditional 11.6
lemon 11.3
gift 11.2
eating 10.9
close 10.8
cut 10.8
dessert 10.6
color 10.6
golden 10.3
object 10.3
tropical 10.2
beverage 10.2
slice 10
ripe 10
delicious 9.9
oranges 9.8
refreshment 9.7
shiny 9.5
ornament 9.5
party 9.5
dieting 9.4
winter 9.4
gourmet 9.3
ribbon 9.3
candle 9
vegetarian 8.9
kiwi 8.9
decorate 8.6
vitamins 8.6
snack 8.5
festive 8.3
pear 8.1
celebrate 8.1
symbol 8.1
raw 8
market 8
mandarin 7.8
container 7.8
money 7.7
eat 7.5
decorative 7.5
muskmelon 7.5
tradition 7.4
basket 7.3
year 7.3
present 7.3
tangerine 7.2
currency 7.2
ingredient 7

Google
created on 2018-02-09

product 81.7
gift 62.8
food 62.5
coffee cup 56.5
still life 55

Microsoft
created on 2018-02-09

indoor 89

Face analysis

AWS Rekognition

Age 49-69
Gender Female, 97.7%
Surprised 1.4%
Angry 1.3%
Disgusted 1.1%
Confused 3%
Happy 86.4%
Calm 2.8%
Sad 4.1%

AWS Rekognition

Age 26-43
Gender Female, 97.9%
Disgusted 2.2%
Confused 1.1%
Happy 91.5%
Angry 1.7%
Calm 0.9%
Surprised 1.5%
Sad 1.3%

AWS Rekognition

Age 15-25
Gender Female, 97.3%
Disgusted 1.7%
Happy 6.6%
Surprised 9.6%
Angry 5.2%
Calm 46.9%
Sad 16.1%
Confused 14%
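Each AWS Rekognition block above reports a per-emotion confidence; a face's dominant emotion is simply the highest-scoring entry. A minimal sketch, assuming an `Emotions` list shaped like Rekognition's DetectFaces output (the values mirror the first face block above):

```python
# Sketch: pick the dominant emotion from a Rekognition-style Emotions
# list (Type/Confidence pairs, as returned by DetectFaces).
# Values are copied from the first face analysis block in this record.

emotions = [
    {"Type": "SURPRISED", "Confidence": 1.4},
    {"Type": "ANGRY", "Confidence": 1.3},
    {"Type": "DISGUSTED", "Confidence": 1.1},
    {"Type": "CONFUSED", "Confidence": 3.0},
    {"Type": "HAPPY", "Confidence": 86.4},
    {"Type": "CALM", "Confidence": 2.8},
    {"Type": "SAD", "Confidence": 4.1},
]

def dominant_emotion(emotions):
    """Return the (type, confidence) pair with the highest confidence."""
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(emotions))  # highest-confidence emotion for this face
```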

Microsoft Cognitive Services

Age 60
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 30
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Egg 89%
Wristwatch 85.6%
Person 71.5%
Burger 63.3%

Captions

Microsoft
created on 2018-02-09

food on a table 80.7%
a close up of food on a table 80.6%
a bowl of food on a table 78.6%

Text analysis

Amazon

fe
y
ClL Coednldl fe
2
Coednldl
ClL
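The fragmentary strings above are raw OCR detections from the photograph. Text-detection services such as Rekognition's DetectText attach a confidence to each detection, so noisy fragments like these can be filtered by threshold. A hedged sketch (the response shape follows DetectText; the confidence values are hypothetical, since the record does not include them):

```python
# Sketch: filter Rekognition-style DetectText detections by confidence.
# Detected strings come from the record; the confidences are hypothetical.

detections = [
    {"DetectedText": "fe", "Confidence": 41.0},
    {"DetectedText": "y", "Confidence": 38.5},
    {"DetectedText": "ClL Coednldl fe", "Confidence": 35.2},
    {"DetectedText": "2", "Confidence": 55.0},
    {"DetectedText": "Coednldl", "Confidence": 33.9},
    {"DetectedText": "ClL", "Confidence": 36.1},
]

def confident_text(detections, threshold=50.0):
    """Keep only detections at or above the confidence threshold."""
    return [d["DetectedText"] for d in detections
            if d["Confidence"] >= threshold]

print(confident_text(detections))  # low-confidence fragments dropped
```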