Human Generated Data

Title

Box

Date

18th century

People
Classification

Boxes

Credit Line

Harvard Art Museums/Fogg Museum, Bequest of Grenville L. Winthrop, 1943.1627

Machine Generated Data

Tags

Amazon
created on 2022-02-19

Human 99.3
Person 99.3
Art 92.3
Basket 85.3
Painting 78.6

Imagga
created on 2022-02-19

wicker 66.8
basket 54.9
container 49.6
product 34.7
work 34.2
creation 20.9
vessel 19.6
brown 17.7
food 15.3
rattan 15
hamper 14.7
wood 14.2
wooden 14.1
money 13.6
shopping basket 12.4
old 11.8
woven 11.8
pet 11
close 10.8
barrel 10.7
coins 10.6
bucket 10.5
handle 10.5
chocolate 10.5
storage 10.5
animal 10.2
finance 10.1
decoration 10.1
closeup 10.1
investment 10.1
domestic 9.9
wealth 9.9
gold 9.9
tin 9.7
little 9.7
metal 9.7
empty 9.6
switch 9.5
nobody 9.3
cash 9.1
mammal 9
object 8.8
natural 8.7
can 8.6
coin 8.6
cat 8.5
cake 8.5
fur 8.4
currency 8.1
bank 8.1
box 8
dessert 7.9
gift 7.7
bowl 7.7
snack 7.7
egg 7.6
easter 7.6
rich 7.4
furniture 7.3
business 7.3
instrument of punishment 7.2
spring 7.1

Google
created on 2022-02-19

Microsoft
created on 2022-02-19

painting 93.5
human face 93.1
art 92.4
person 90.9
museum 71.3
woman 69
clothing 52.1

Face analysis

Amazon

Microsoft

AWS Rekognition

Age 4-12
Gender Female, 100%
Calm 88.1%
Happy 7.6%
Confused 1.3%
Surprised 0.8%
Angry 0.8%
Sad 0.8%
Disgusted 0.3%
Fear 0.2%

Microsoft Cognitive Services

Age 19
Gender Female

Feature analysis

Amazon

Person 99.3%
Painting 78.6%

Captions

Microsoft

a close up of a bowl 57%
close up of a bowl 47.7%
a person sitting in a bowl 35.1%

Text analysis

Google

Savnile