Human Generated Data

Title

Photo Album

Date

c. 1857 - c. 1874

People

-

Classification

Photographs

Credit Line

Harvard Art Museums/Arthur M. Sackler Museum, Transfer from Widener Library, Harvard University, 1978.484.23

Machine Generated Data

Tags

Amazon
created on 2022-01-09

Human 98.6
Person 98.6
Person 98.1
Person 97.8
Painting 97.2
Art 97.2
Person 96.1
Person 95.7
Person 95.6
Nature 94.1
Person 93.3
Outdoors 91
Person 87.3
Person 78
Land 72.4
Water 67
People 61.9
Tree 60.5
Plant 60.5
Market 55.2
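
The label list above resembles the output of a label-detection API such as AWS Rekognition's DetectLabels, which returns label names with confidence scores. A minimal sketch of turning such a response into the list shown, using sample data shaped like the real response rather than an actual API call (a live call would go through `boto3.client("rekognition").detect_labels(...)`):

```python
# Sample data shaped like a Rekognition DetectLabels response (not a real call).
response = {
    "Labels": [
        {"Name": "Market", "Confidence": 55.2},
        {"Name": "Human", "Confidence": 98.6},
        {"Name": "Painting", "Confidence": 97.2},
        {"Name": "Nature", "Confidence": 94.1},
    ]
}

def format_labels(response, min_confidence=55.0):
    """Return 'Name Confidence' lines, sorted by descending confidence."""
    labels = [
        (label["Name"], label["Confidence"])
        for label in response["Labels"]
        if label["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: pair[1], reverse=True)
    return [f"{name} {round(conf, 1)}" for name, conf in labels]

for line in format_labels(response):
    print(line)
```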

Clarifai
created on 2023-10-25

people 100
group 99.8
many 99.4
adult 98.2
group together 96.9
man 95.3
vehicle 95
child 93.9
woman 93.3
several 92.6
home 92
art 91.3
cavalry 90.7
soldier 90.6
crowd 90.3
monochrome 89.9
interaction 87.7
street 86.6
print 84.4
wear 83.6

Imagga
created on 2022-01-09

cemetery 89.1
tree 31.1
landscape 26.8
structure 24.2
trees 23.1
snow 21.5
sky 21.1
old 20.2
winter 19.6
architecture 18.8
travel 18.3
park 17.8
building 17.3
water 16
city 15.8
forest 15.7
scenery 15.3
river 15.1
fountain 15
stone 14.9
fall 14.5
autumn 14.1
fog 13.5
outdoor 13
season 12.5
scenic 12.3
ancient 12.1
natural 12
history 11.6
tourism 11.6
house 11.4
spring 11
morning 10.9
light 10.7
sun 10.5
scene 10.4
cold 10.3
town 10.2
lake 10.1
field 10
yellow 9.9
environment 9.9
outdoors 9.8
country 9.7
grass 9.5
buildings 9.5
mountains 9.3
countryside 9.1
black 9
plant 8.9
rural 8.8
mist 8.7
woods 8.6
old fashioned 8.6
outside 8.6
sunrise 8.4
vintage 8.3
branch 8.2
landmark 8.1
night 8
bridge 7.9
seasonal 7.9
culture 7.7
palace 7.7
gravestone 7.7
church 7.4
foliage 7.4
historic 7.3
memorial 7.1
sunlight 7.1
mountain 7.1
day 7.1
weather 7.1
leaf 7

Google
created on 2022-01-09

Botany 87.9
Tree 86
Adaptation 79.3
Motor vehicle 75.3
Tints and shades 74.9
Art 74.5
Holy places 71.4
Building 69.2
Wood 68.2
Vintage clothing 67.9
History 67.9
Paper product 66.5
Water 66.1
Event 63.8
Visual arts 63.1
Stock photography 61.9
Antique 61.7
Plant 60.4
Classic 60.3
Painting 56.3

Microsoft
created on 2022-01-09

old 94
tree 88.1
grave 71
cemetery 68.6
text 64.2
group 56.5
vintage 35.5

Face analysis

Amazon

AWS Rekognition

Age 34-42
Gender Male, 64%
Calm 67.2%
Sad 14.4%
Happy 9.4%
Fear 3.5%
Angry 1.7%
Disgusted 1.5%
Confused 1.2%
Surprised 1%

AWS Rekognition

Age 9-17
Gender Male, 98.9%
Disgusted 64%
Confused 11.5%
Sad 6.7%
Calm 6.2%
Happy 4.2%
Surprised 3.7%
Angry 2.3%
Fear 1.3%

AWS Rekognition

Age 34-42
Gender Female, 97.9%
Fear 61%
Sad 23.6%
Calm 6.9%
Disgusted 3.1%
Happy 2.2%
Angry 1.5%
Surprised 1.1%
Confused 0.7%

AWS Rekognition

Age 22-30
Gender Male, 99.6%
Confused 69.3%
Surprised 9.5%
Sad 7.2%
Disgusted 6.1%
Calm 2.9%
Angry 2.7%
Fear 2.1%
Happy 0.3%

AWS Rekognition

Age 45-51
Gender Male, 99.9%
Calm 56%
Sad 24.8%
Confused 5.8%
Disgusted 5.3%
Fear 2.8%
Angry 2.4%
Surprised 1.6%
Happy 1.1%

AWS Rekognition

Age 37-45
Gender Male, 81.3%
Sad 75%
Calm 19.5%
Happy 2.9%
Fear 1.3%
Disgusted 0.4%
Angry 0.3%
Confused 0.3%
Surprised 0.2%

AWS Rekognition

Age 23-31
Gender Male, 96.9%
Calm 81.2%
Angry 7.7%
Sad 7%
Confused 2.4%
Disgusted 0.7%
Surprised 0.4%
Fear 0.3%
Happy 0.2%

AWS Rekognition

Age 20-28
Gender Male, 99.6%
Calm 52.4%
Sad 45.9%
Confused 0.7%
Angry 0.4%
Surprised 0.2%
Disgusted 0.2%
Fear 0.1%
Happy 0.1%

AWS Rekognition

Age 22-30
Gender Male, 85%
Sad 50.9%
Calm 32.5%
Disgusted 4.5%
Fear 4.4%
Angry 3.5%
Happy 2.2%
Confused 1%
Surprised 0.9%
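
Each face block above (age range, gender, emotions ranked by confidence) matches the shape of a face record returned by AWS Rekognition's DetectFaces with full attributes. A sketch of formatting one such record, again using sample data in the response shape rather than a live API call:

```python
# Sample face record shaped like one entry of a Rekognition DetectFaces
# response with Attributes=['ALL'] (sample data, not an actual call).
face = {
    "AgeRange": {"Low": 34, "High": 42},
    "Gender": {"Value": "Male", "Confidence": 64.0},
    "Emotions": [
        {"Type": "SAD", "Confidence": 14.4},
        {"Type": "CALM", "Confidence": 67.2},
        {"Type": "HAPPY", "Confidence": 9.4},
    ],
}

def format_face(face):
    """Render a face record as the 'Age / Gender / emotions' lines above."""
    lines = [
        f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}",
        f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:g}%",
    ]
    # Emotions are reported sorted by descending confidence.
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emotion['Type'].capitalize()} {emotion['Confidence']:g}%")
    return lines

print("\n".join(format_face(face)))
```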

Feature analysis

Amazon

Person 98.6%
Painting 97.2%

Categories