Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.75

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.1
Human 99.1
Person 98.9
Person 96.2
Person 96.2
Painting 86.4
Art 86.4
Face 84.4
People 84.2
Room 73.2
Indoors 73.2
Sitting 65.5
Portrait 64.9
Photography 64.9
Photo 64.9
Performer 58
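
The label/confidence pairs above match the output shape of Amazon Rekognition's DetectLabels API: label names with confidence scores on a 0-100 scale. A minimal Python sketch of how such tags are produced, assuming configured AWS credentials; the file name is a hypothetical stand-in for the digitized album page, not part of the record.

import boto3

client = boto3.client("rekognition")  # assumes AWS credentials are configured

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=50,
    )

# Each label carries a Name and a 0-100 Confidence, e.g. "Person 99.1".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")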

Clarifai
created on 2023-10-25

people 100
group 99.8
adult 99.5
woman 99.3
portrait 98.9
man 98.6
three 98.5
two 98.1
offspring 97.9
child 97.7
group together 96.7
four 96.4
leader 95.9
five 95.7
wear 95.3
vehicle 95
sit 93.9
seat 93.9
veil 93.6
transportation system 93
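
Clarifai reports concept values in [0, 1], scaled to percentages in the listing above. A hedged sketch against one historical shape of Clarifai's v2 REST predict endpoint; the key, model ID, and file name are placeholders, and newer Clarifai deployments may require additional routing fields such as a user/app ID.

import base64
import requests

API_KEY = "YOUR_CLARIFAI_KEY"           # placeholder
MODEL_ID = "general-image-recognition"  # assumed general-model ID

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    payload = {
        "inputs": [
            {"data": {"image": {"base64": base64.b64encode(f.read()).decode()}}}
        ]
    }

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json=payload,
)

# Concept values are 0-1; multiply by 100 to match rows like "people 100".
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")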

Imagga
created on 2022-01-09

television 38.2
telecommunication system 27.6
man 27.5
people 27.3
person 23.8
sitting 20.6
adult 20.3
male 19.2
love 18.9
couple 18.3
lifestyle 18
attractive 16.8
chair 16.4
happy 16.3
room 15.3
professional 14.9
pretty 14.7
home 14.3
black 13.9
looking 13.6
fashion 13.6
business 13.3
happiness 13.3
portrait 12.9
men 12.9
sexy 12.8
couch 12.5
leisure 12.4
smiling 12.3
musical instrument 12.1
sofa 11.8
suit 11.7
model 11.7
cheerful 11.4
furniture 10.8
smile 10.7
office 10.6
businessman 10.6
lady 10.5
together 10.5
passion 10.3
laptop 10.1
indoor 10
house 10
dress 9.9
fun 9.7
thinking 9.5
relax 9.3
executive 9.2
romance 8.9
interior 8.8
women 8.7
waiting 8.7
human 8.2
one 8.2
sensual 8.2
style 8.2
worker 8.1
romantic 8
night 8
job 8
computer 7.9
guitar 7.8
reading 7.6
studio 7.6
hand 7.6
tie 7.6
wife 7.6
living 7.6
elegance 7.5
stringed instrument 7.4
sensuality 7.3
group 7.2
body 7.2
handsome 7.1
posing 7.1
working 7.1
work 7.1
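
Imagga returns comparable tag/confidence pairs from its /v2/tags endpoint, with confidences already on a 0-100 scale. A minimal sketch assuming placeholder API credentials and a publicly reachable image URL.

import requests

IMAGGA_KEY = "YOUR_API_KEY"        # placeholder: Imagga key/secret pair
IMAGGA_SECRET = "YOUR_API_SECRET"  # used for HTTP basic auth
IMAGE_URL = "https://example.org/photo_album.jpg"  # hypothetical URL

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)

# Confidences are already 0-100, e.g. "television 38.2".
for item in resp.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")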

Google
created on 2022-01-09

Rectangle 78.6
Art 78.5
Tints and shades 77.4
Vintage clothing 76.9
Motor vehicle 71.3
Event 68.5
Visual arts 67
Sitting 66.4
Classic 65.8
Oval 62.2
Room 60.5
Square 58.3
Painting 55.9
Still life photography 54.4
History 54.4
Monochrome 51.1
Chair 50.6
Antique 50.3
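
The Google tags above have the shape of Cloud Vision label annotations, whose scores are 0-1 and appear here as percentages. A minimal google-cloud-vision sketch, assuming application-default credentials and the same hypothetical file name as above.

from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; multiply by 100 to match rows like "Art 78.5".
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")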

Microsoft
created on 2022-01-09

person 97.2
clothing 96.2
wall 95.6
old 90.1
man 89.3
text 87.4
posing 75.7
smile 72.5
vintage 29.8
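
Microsoft's tag list matches the Azure Computer Vision tagging operation, which returns tag names with 0-1 confidences. A hedged sketch using the azure-cognitiveservices-vision-computervision SDK; the endpoint, key, and image URL are placeholders.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder
IMAGE_URL = "https://example.org/photo_album.jpg"                # hypothetical

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))
result = client.tag_image(IMAGE_URL)

# Confidences are 0-1; multiply by 100 to match rows like "person 97.2".
for tag in result.tags:
    print(f"{tag.name} {tag.confidence * 100:.1f}")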

Face analysis

AWS Rekognition

Age 14-22
Gender Female, 100%
Calm 99.3%
Confused 0.2%
Angry 0.2%
Happy 0.1%
Sad 0.1%
Disgusted 0.1%
Fear 0%
Surprised 0%

AWS Rekognition

Age 13-21
Gender Female, 100%
Calm 98.5%
Sad 1.1%
Confused 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0%
Fear 0%
Happy 0%

AWS Rekognition

Age 11-19
Gender Female, 99.7%
Calm 91.8%
Confused 3.4%
Sad 2.9%
Surprised 0.6%
Angry 0.3%
Fear 0.3%
Happy 0.3%
Disgusted 0.3%

AWS Rekognition

Age 18-26
Gender Female, 99.9%
Calm 98%
Sad 0.6%
Happy 0.4%
Surprised 0.3%
Angry 0.2%
Confused 0.2%
Fear 0.1%
Disgusted 0.1%
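
The four age/gender/emotion blocks above have the structure of Amazon Rekognition's DetectFaces response when all attributes are requested. A minimal boto3 sketch under the same assumptions as the earlier Rekognition example (configured credentials, hypothetical file name).

import boto3

client = boto3.client("rekognition")

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # required for age range, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    gender = face["Gender"]
    print(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions arrive as type/confidence pairs, e.g. CALM 99.3.
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")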

Microsoft Cognitive Services

Age 28
Gender Female

Microsoft Cognitive Services

Age 31
Gender Female

Microsoft Cognitive Services

Age 20
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female
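
The per-face age and gender estimates above match what the Azure Face API's detection call returned when those attributes were requested; Microsoft has since retired facial age and gender estimation. A hedged sketch against the older azure-cognitiveservices-vision-face SDK, with placeholder endpoint and key.

from azure.cognitiveservices.vision.face import FaceClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                 # placeholder

face_client = FaceClient(ENDPOINT, CognitiveServicesCredentials(KEY))

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    faces = face_client.face.detect_with_stream(
        f, return_face_attributes=["age", "gender"]
    )

for face in faces:
    attrs = face.face_attributes
    print(f"Age {attrs.age:.0f}")
    print(f"Gender {attrs.gender}")  # may print as an enum member, e.g. female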

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Possible
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
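
The likelihood ratings above (Very unlikely through Very likely) correspond to Cloud Vision face-annotation fields, which use a VERY_UNLIKELY..VERY_LIKELY enum rather than numeric scores. A minimal google-cloud-vision sketch under the same assumptions as the label example.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo_album.jpg", "rb") as f:  # hypothetical file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Each face annotation rates expressions and image properties on the
# likelihood enum, matching rows such as "Headwear Likely".
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)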

Feature analysis

Amazon

Person 99.1%
Painting 86.4%
