Human Generated Data

Title

Commerce

Date

1844-1857

People

Artist: Toppan Carpenter & Co., American

Classification

Prints

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Dorothy Rackemann, daughter of Francis Rackemann, class of 1909, M.D. 1912, M21626

Machine Generated Data

Tags

Amazon
created on 2022-03-12

Person 96.2
Human 96.2
Person 95
Clothing 90.3
Apparel 90.3
Helmet 88.7
Crash Helmet 88.7
Person 79
People 76.4
Photography 61
Photo 61
Astronaut 56.2
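
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's label-detection API. A minimal sketch of how such pairs can be produced with boto3 follows; the image file name and confidence threshold are placeholders, and this is not necessarily the pipeline used for this record.

```python
# Illustrative sketch (not the museum's actual pipeline): label/confidence
# pairs via Amazon Rekognition. "commerce.jpg" is a placeholder file name.
import boto3

rekognition = boto3.client("rekognition")

with open("commerce.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # placeholder threshold; drops low-confidence labels
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```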

Imagga
created on 2022-03-12

sand 82.3
soil 39.5
beach 29.8
earth 28.9
desert 21.7
sea 20.5
wild 20
summer 19.9
travel 19.7
animal 19.7
wildlife 18.7
water 18
safari 17.3
landscape 17.1
crustacean 17
ocean 16.7
arthropod 16.3
gymnosperm 15.7
vacation 15.5
dune 14.1
tropical 13.6
mammal 13.5
sun 12.9
outdoors 12.7
coast 12.6
tourism 12.4
invertebrate 11.9
texture 11.8
spermatophyte 11.8
outdoor 11.5
natural 11.4
dry 11.1
sky 10.8
tarantula 10.6
shore 10.2
animals 10.2
wilderness 9.4
grunge 9.4
spider 9.3
crab 9.2
island 9.1
plant 9.1
park 9.1
game 9
brown 8.8
holiday 8.6
coastline 8.5
south 8.4
old 8.4
grain 8.3
arachnid 8.2
vascular plant 8
textured 7.9
antique 7.8
nobody 7.8
adventure 7.6
resort 7.5
vintage 7.4
rest 7.4
camel 7.3
national 7.2
aged 7.2
surface 7
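
The Imagga tags follow the same tag/confidence pattern. A minimal sketch of a request to Imagga's v2 tagging endpoint, assuming placeholder credentials and a placeholder image URL:

```python
# Illustrative sketch: tag/confidence pairs from Imagga's v2 tagging endpoint.
# Credentials and image URL are placeholders.
import requests

API_KEY = "your_api_key"        # placeholder
API_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/commerce.jpg"  # placeholder

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
response.raise_for_status()

for entry in response.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```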

Google
created on 2022-03-12

Microsoft
created on 2022-03-12

laying 80.5
drawing 78.7
black 76.3
text 74.8
sketch 59.1
vehicle 58.7
envelope 29

Face analysis

AWS Rekognition

Age 6-14
Gender Male, 94.2%
Surprised 39.2%
Calm 26.5%
Disgusted 14.1%
Fear 6.5%
Confused 6.2%
Angry 2.9%
Sad 2.5%
Happy 2.2%

AWS Rekognition

Age 6-16
Gender Male, 96.8%
Calm 97.6%
Sad 1.1%
Confused 0.4%
Fear 0.3%
Angry 0.2%
Surprised 0.2%
Happy 0.1%
Disgusted 0.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
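
The face-analysis blocks above pair AWS Rekognition results (age range, gender, and emotion confidences per detected face) with Google Cloud Vision results (likelihood buckets per attribute). A minimal sketch of both calls, with a placeholder file name and default credentials; this is illustrative, not the record's actual pipeline:

```python
# Illustrative sketch of the two face-analysis calls reported above.
# File name and credentials are placeholders.
import boto3
from google.cloud import vision

with open("commerce.jpg", "rb") as f:
    image_bytes = f.read()

# Amazon Rekognition: age range, gender, and emotion confidences per face.
rekognition = boto3.client("rekognition")
faces = rekognition.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])
for face in faces["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")

# Google Cloud Vision: likelihood buckets (e.g. VERY_UNLIKELY) per face.
gcv = vision.ImageAnnotatorClient()
result = gcv.face_detection(image=vision.Image(content=image_bytes))
for face in result.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```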

Feature analysis

Amazon

Person 96.2%

Captions

Microsoft

a person lying on a bed 30.7%
a person lying on the ground 30.6%
a person lying on the floor 30.5%
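
The Microsoft captions are ranked text/confidence candidates of the kind returned by the Azure Computer Vision describe operation. A minimal sketch, assuming a placeholder endpoint, key, and file name:

```python
# Illustrative sketch: ranked image captions via the Azure Computer Vision SDK.
# Endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder endpoint
    CognitiveServicesCredentials("<your-key>"),              # placeholder key
)

with open("commerce.jpg", "rb") as f:
    description = client.describe_image_in_stream(f, max_candidates=3)

for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```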