Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.60

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.2
Human 99.2
Person 98.1
Person 94.8
Person 92.9
Musician 92.5
Musical Instrument 92.5
Person 92.3
Person 91.8
Person 91.1
Lute 90.8
Person 74.2
Leisure Activities 70.1
Guitar 58.5
Gong 58.4
Banjo 56.7
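
The Amazon tags above are label/confidence pairs of the kind returned by Amazon Rekognition's DetectLabels operation. A minimal sketch of how such output could be reproduced, assuming boto3 with configured AWS credentials and a hypothetical scan of the album page named page.jpg:

# Hypothetical sketch: label/confidence pairs like the Amazon tags above,
# via AWS Rekognition DetectLabels (boto3). File name and thresholds are assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("page.jpg", "rb") as f:      # hypothetical scan of the album page
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,
    MinConfidence=55,                  # the tags above bottom out around 56-58
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')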

Clarifai
created on 2023-10-25

people 100
group 99.2
adult 98.7
group together 98.4
several 98.3
many 97.7
three 97.6
man 97.5
lid 96.8
veil 96.4
woman 96
five 95.8
two 95.8
four 94.7
vehicle 94.7
one 91.2
wear 89.8
leader 89
retro 84.2
child 83.8

Imagga
created on 2022-01-09

electric fan 39.8
ball 37.6
device 35.5
fan 31.5
equipment 23.6
percussion instrument 23
gong 20.1
basketball 18.1
musical instrument 17.4
game equipment 16.2
round 14.7
food 11.5
fruit 11.4
sports equipment 11.3
decoration 11.3
brown 11
game 10.7
soccer ball 10.5
celebration 10.4
sport 10.1
healthy 10.1
holiday 10
basketball equipment 9.8
close 9.7
basket 9.6
balls 9.6
glass 9.5
sphere 9.2
wicker 9.1
ornament 8.6
season 8.6
money 8.5
tree 8.5
winery 8.4
field 8.4
color 8.3
gold 8.2
symbol 8.1
object 8.1
drum 8.1
natural 8
agriculture 7.9
seasonal 7.9
play 7.7
shield 7.7
straw 7.7
football 7.7
old 7.7
winter 7.7
vintage 7.6
ventilator 7.6
hat 7.5
crop 7.5
leisure 7.5
closeup 7.4
stack 7.4
fresh 7.2
currency 7.2
shiny 7.1
summer 7.1
container 7

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

animal 87.9
person 86.5
clothing 79.8
man 68.7
text 61.7
old 61.6
drawing 55.7
image 30.9
stone 12.5
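
The Microsoft tags above (tag plus confidence) match the output of Azure Computer Vision's tagging endpoint. A minimal sketch, assuming the azure-cognitiveservices-vision-computervision SDK with placeholder endpoint, key, and file name:

# Hypothetical sketch: tag/confidence pairs like the Microsoft tags above, using
# the Azure Computer Vision SDK. Endpoint, key, and file name are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    endpoint="https://<resource-name>.cognitiveservices.azure.com/",   # placeholder
    credentials=CognitiveServicesCredentials("<subscription-key>"),    # placeholder
)

with open("page.jpg", "rb") as f:      # hypothetical scan of the album page
    result = client.tag_image_in_stream(f)

for tag in result.tags:
    # the SDK reports confidence in [0, 1]; scale to match the percentages above
    print(f"{tag.name} {tag.confidence * 100:.1f}")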

Face analysis

AWS Rekognition

Age 18-26
Gender Male, 69.4%
Calm 88.6%
Sad 10%
Angry 0.6%
Confused 0.3%
Fear 0.3%
Happy 0.1%
Surprised 0.1%
Disgusted 0.1%

AWS Rekognition

Age 19-27
Gender Male, 99.6%
Calm 98.8%
Sad 0.4%
Fear 0.3%
Angry 0.2%
Confused 0.1%
Happy 0.1%
Disgusted 0.1%
Surprised 0%

AWS Rekognition

Age 18-26
Gender Male, 98.8%
Calm 87.7%
Fear 4.2%
Sad 3.7%
Confused 2.9%
Surprised 0.5%
Happy 0.4%
Disgusted 0.4%
Angry 0.3%

AWS Rekognition

Age 25-35
Gender Male, 100%
Calm 99.1%
Sad 0.4%
Confused 0.2%
Fear 0.1%
Happy 0.1%
Angry 0%
Surprised 0%
Disgusted 0%

AWS Rekognition

Age 23-31
Gender Female, 93.1%
Calm 97.3%
Sad 2.4%
Fear 0.1%
Angry 0.1%
Confused 0%
Happy 0%
Disgusted 0%
Surprised 0%

AWS Rekognition

Age 19-27
Gender Male, 97.4%
Calm 87.3%
Sad 4.4%
Angry 3.4%
Surprised 1.9%
Fear 1.3%
Confused 1%
Happy 0.6%
Disgusted 0.2%

AWS Rekognition

Age 22-30
Gender Male, 99.9%
Calm 45.3%
Confused 20.5%
Surprised 13%
Happy 5.9%
Disgusted 4.9%
Fear 3.7%
Angry 3.5%
Sad 3.1%
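
Each AWS Rekognition face entry above (estimated age range, gender, and a ranked list of emotion scores) matches the shape of the DetectFaces response when all attributes are requested. A minimal sketch, assuming boto3 and the same hypothetical page.jpg:

# Hypothetical sketch: per-face age range, gender, and emotion confidences like the
# AWS Rekognition entries above, via DetectFaces with Attributes=["ALL"].
import boto3

rekognition = boto3.client("rekognition")

with open("page.jpg", "rb") as f:      # hypothetical scan of the album page
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],                # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')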

Microsoft Cognitive Services

Age 32
Gender Male

Microsoft Cognitive Services

Age 39
Gender Male

Microsoft Cognitive Services

Age 33
Gender Male

Microsoft Cognitive Services

Age 28
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 23
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Likely
Blurred Unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
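
The Google Vision face entries above report likelihood buckets (Very unlikely, Unlikely, Possible, Likely) rather than percentages. A minimal sketch of how such fields could be read with the google-cloud-vision client, again assuming a hypothetical page.jpg:

# Hypothetical sketch: face-detection likelihood buckets (surprise, anger, sorrow,
# joy, headwear, blur) like the Google Vision entries above. The printed enum names
# (e.g. VERY_UNLIKELY, POSSIBLE) correspond to the buckets shown in this record.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("page.jpg", "rb") as f:      # hypothetical scan of the album page
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)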

Feature analysis

Amazon

Person 99.2%
