Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.57

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 99.2
Person 94.4
Cleaning 93.3
Outdoors 84.2
Washing 67.5
Garden 56.8
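
Labels in this style come from Amazon Rekognition's label-detection endpoint. A minimal sketch of such a call with boto3, assuming configured AWS credentials; the region and the file name photo_album_page.jpg are placeholders, not taken from this record:

```python
import boto3

# Sketch only: region and file name are assumptions.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo_album_page.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=10,
        MinConfidence=50,  # drop labels scored below 50%
    )

# Each label carries a name and a confidence in percent, matching the
# "tag score" pairs listed above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```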

Clarifai
created on 2023-10-25

people 99.4
bucket 98.5
broom 96.1
one 94.2
adult 93.8
child 93.3
two 91.3
man 89.9
shovel 89.8
spade 89.7
rake 89
wear 88.3
retro 87.6
shovelful 87.4
family 86.8
wood 82.9
tool 80.7
three 80.5
container 78.1
farming 76.4
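
Clarifai concept predictions like these can be fetched through its v2 REST API. A minimal sketch with requests; the API key, model ID, and image URL below are placeholders:

```python
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"       # placeholder
MODEL_ID = "general-image-recognition"  # assumed public general model

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": "https://example.com/album.jpg"}}}]},
)
response.raise_for_status()

# Concepts arrive with 0-1 values; scaling by 100 gives scores like those above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```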

Imagga
created on 2022-01-09

cleaner 100
adult 20
person 17.9
vacuum 15.7
old 15.3
fashion 15.1
man 14.8
attractive 14.7
pretty 14
people 13.9
one 13.4
portrait 12.9
bucket 12.7
lady 12.2
smiling 10.8
tool 10.8
water 10.7
happy 10.6
sexy 10.4
body 10.4
posing 9.8
interior 9.7
indoors 9.7
looking 9.6
smile 9.3
male 9.2
house 9.2
travel 9.1
dress 9
home appliance 8.9
work 8.8
home 8.8
building 8.7
standing 8.7
lifestyle 8.7
hand 8.3
clean 8.3
vessel 8.3
container 7.9
black 7.8
model 7.8
crutch 7.7
wall 7.7
walking 7.6
human 7.5
style 7.4
sport 7.4
makeup 7.3
alone 7.3
hair 7.1
working 7.1
shovel 7.1
architecture 7
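
Imagga exposes its tagger as a REST endpoint authenticated with HTTP Basic auth. A minimal sketch, with placeholder credentials and image URL:

```python
import requests

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/album.jpg"},
    auth=("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET"),  # placeholders
)
response.raise_for_status()

# Each tag pairs a 0-100 confidence with a language-keyed name.
for tag in response.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')
```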

Microsoft
created on 2022-01-09

outdoor 89.8
waste container 87.6
man 86.9
person 85.4
clothing 84.1
boy 50.2
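
Tags like these can be produced by the Azure Computer Vision tag endpoint. A minimal sketch, assuming an existing Azure resource; the endpoint, key, and image URL are placeholders:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

response = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/album.jpg"},
)
response.raise_for_status()

# Confidences are 0-1; scaled by 100 they match the scores above.
for tag in response.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```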

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 100%
Confused 47.9%
Calm 42.3%
Surprised 3.0%
Fear 2.6%
Sad 2.0%
Angry 1.2%
Disgusted 0.6%
Happy 0.5%
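
Age ranges, gender, and per-emotion confidences like these come from Rekognition's face-attribute detection. A minimal boto3 sketch, again with a placeholder region and file name:

```python
import boto3

client = boto3.client("rekognition", region_name="us-east-1")  # region assumed

with open("photo_album_page.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.0f}%')
    for emotion in face["Emotions"]:  # one confidence per emotion type
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```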

Microsoft Cognitive Services

Age 42
Gender Male
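
A point estimate of age plus a gender label matches the shape of the Azure Face API detect call. A minimal sketch; the endpoint and key are placeholders, and these face attributes are restricted on newer Azure accounts:

```python
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_SUBSCRIPTION_KEY"                                   # placeholder

response = requests.post(
    f"{ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.com/album.jpg"},
)
response.raise_for_status()

# Each detected face carries an age estimate and a gender label.
for face in response.json():
    attrs = face["faceAttributes"]
    print(f'Age {attrs["age"]:.0f}')
    print(f'Gender {attrs["gender"].title()}')
```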

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
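
Google Vision reports face attributes as likelihood buckets rather than percentages, which is why the values above read "Very unlikely". A minimal sketch with the google-cloud-vision client, assuming application-default credentials and a placeholder file name:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo_album_page.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum buckets from VERY_UNLIKELY to VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```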

Feature analysis

Amazon

Person 94.4%
