Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.66

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Person 99.6
Human 99.6
Person 99.6
Person 97.9
Leisure Activities 88.5
Musician 84.4
Musical Instrument 84.4
Fiddle 75
Violin 75
Viola 75
Person 72.4
Photo 61.8
Photography 61.8
Portrait 60.5
Face 60.5
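
The Amazon tags above resemble the label list returned by AWS Rekognition's DetectLabels operation, where each label carries a confidence score. A minimal sketch (no live API call; the response dict below is hand-built in Rekognition's documented shape, using values from the list above) of filtering labels by a confidence threshold:

```python
# Sketch: filter a Rekognition-style DetectLabels response by confidence.
# The response shape mirrors AWS Rekognition's documented output; the
# sample values are taken from the Amazon tags listed above.

def labels_above(response, threshold):
    """Return (name, confidence) pairs at or above the threshold."""
    return [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= threshold
    ]

response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.6},
        {"Name": "Musician", "Confidence": 84.4},
        {"Name": "Violin", "Confidence": 75.0},
        {"Name": "Portrait", "Confidence": 60.5},
    ]
}

print(labels_above(response, 80))  # high-confidence labels only
```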

Clarifai
created on 2023-10-25

people 100
adult 99.3
group 99.3
two 98.4
leader 98.3
wear 97.5
man 97.5
group together 97.5
portrait 97.3
three 96.9
outfit 93.7
lid 92.1
uniform 91.7
art 91.7
four 90.7
veil 90.3
military uniform 90.2
military 90.2
child 89.2
several 89.1

Imagga
created on 2022-01-09

cash 20.1
art 20.1
graffito 19.8
currency 19.7
money 19.5
decoration 19.5
book jacket 15.6
old 15.3
bank 15.2
black 15.1
dollar 14.8
sculpture 14.6
finance 14.3
savings 14
financial 13.4
bill 13.3
jacket 13.1
banking 12.9
china 12.2
wealth 11.7
dollars 11.6
covering 11.5
paper 11.1
religion 10.7
vintage 10.7
business 10.3
ceramic ware 10.2
container 10.2
symbol 10.1
one 9.7
pay 9.6
economy 9.3
wrapping 9.2
note 9.2
history 8.9
object 8.8
man 8.7
banknote 8.7
newspaper 8.7
antique 8.6
culture 8.5
grunge 8.5
rich 8.4
church 8.3
letter 8.2
retro 8.2
detail 8
person 7.9
funds 7.8
porcelain 7.8
color 7.8
states 7.7
product 7.7
payment 7.7
united 7.6
design 7.6
creation 7.2
freight car 7.2
world 7.1
night 7.1

Google
created on 2022-01-09

Microsoft
created on 2022-01-09

old 99.7
person 99.2
text 94.1
vintage 90.5
man 86.5
clothing 82.2
black 79
white 78.4
posing 47.8

Color Analysis

Face analysis

AWS Rekognition

Age 13-21
Gender Female, 100%
Confused 84.3%
Calm 8%
Surprised 2.9%
Angry 1.7%
Disgusted 1.3%
Fear 1%
Sad 0.6%
Happy 0.1%

AWS Rekognition

Age 19-27
Gender Male, 99.9%
Confused 33.8%
Calm 22%
Surprised 19.5%
Fear 15.1%
Sad 4.4%
Happy 2.1%
Angry 2%
Disgusted 1%

AWS Rekognition

Age 23-33
Gender Female, 85.4%
Fear 73.1%
Calm 11.9%
Sad 7.8%
Angry 2.1%
Happy 1.7%
Surprised 1.3%
Confused 1.3%
Disgusted 0.8%

AWS Rekognition

Age 26-36
Gender Female, 99.8%
Calm 99.6%
Happy 0.2%
Surprised 0.1%
Sad 0.1%
Confused 0%
Angry 0%
Disgusted 0%
Fear 0%
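
Each AWS Rekognition entry above reports an age range, a gender estimate, and an emotion distribution for one detected face. A minimal sketch (Rekognition-style FaceDetail structure assumed; sample values taken from the first face above) of selecting the dominant emotion:

```python
# Sketch: pick the highest-scoring emotion from a Rekognition-style
# FaceDetail dict. The structure follows AWS Rekognition's documented
# DetectFaces output; the values come from the first entry above.

def dominant_emotion(face_detail):
    """Return the (Type, Confidence) of the highest-scoring emotion."""
    top = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

face = {
    "AgeRange": {"Low": 13, "High": 21},
    "Gender": {"Value": "Female", "Confidence": 100.0},
    "Emotions": [
        {"Type": "CONFUSED", "Confidence": 84.3},
        {"Type": "CALM", "Confidence": 8.0},
        {"Type": "SURPRISED", "Confidence": 2.9},
    ],
}

print(dominant_emotion(face))  # ('CONFUSED', 84.3)
```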

Microsoft Cognitive Services

Age 31
Gender Male

Microsoft Cognitive Services

Age 30
Gender Male

Microsoft Cognitive Services

Age 21
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely
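
The Google Vision entries above use likelihood buckets rather than numeric scores: face annotations come back as Likelihood enum values (UNKNOWN through VERY_LIKELY), which the page renders as "Very unlikely", "Unlikely", and so on. A minimal sketch (the annotation dict below is hand-built for illustration; the enum-to-string mapping follows the documented Likelihood values):

```python
# Sketch: map Google Cloud Vision Likelihood enum values (0..5) to
# their display strings. The sample annotation mirrors the
# "Headwear Unlikely" rows above.

LIKELIHOODS = [
    "Unknown", "Very unlikely", "Unlikely",
    "Possible", "Likely", "Very likely",
]

def describe(annotation):
    """Map enum-valued likelihood fields to their display strings."""
    return {field: LIKELIHOODS[value] for field, value in annotation.items()}

annotation = {
    "surprise_likelihood": 1,   # VERY_UNLIKELY
    "headwear_likelihood": 2,   # UNLIKELY
    "blurred_likelihood": 1,    # VERY_UNLIKELY
}

print(describe(annotation))
```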

Feature analysis

Amazon

Person 99.6%

Categories

Imagga

people portraits 65.7%
paintings art 32.5%

Text analysis

Google

-
-