Human Generated Data

Title

Untitled (panorama of women with barrels)

Date

1890s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Gift of Dr. Robert Drapkin, 2.2002.2723

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 97.6
Human 97.6
Outdoors 96.9
Nature 95.8
Bicycle 94.5
Transportation 94.5
Vehicle 94.5
Bike 94.5
Person 94.4
Person 92.3
Person 92
Person 87.8
Helmet 86.5
Clothing 86.5
Apparel 86.5
Photographer 76
Snow 68.9
Person 62.8
Photography 62.8
Photo 62.8

Clarifai
created on 2023-10-25

people 100
group 99.2
adult 99.1
man 98.3
wear 97.8
child 97.8
veil 96.3
woman 95.2
group together 94.5
four 93.1
several 92.6
two 92.6
three 92.3
art 91.9
print 91.5
military 90
weapon 89.5
gun 86.4
soldier 85.9
recreation 85.3

Imagga
created on 2022-01-08

binoculars 32.9
robe 32
cemetery 29.4
photographer 25.6
optical instrument 24.8
garment 22.6
instrument 20.6
man 19.3
clothing 19.1
male 16.3
device 16.3
people 16.2
gun 13
person 12.8
danger 12.7
adult 12.3
sky 12.1
travel 12
building 11.9
weapon 11.8
war 11.6
landscape 11.1
mask 10.8
landmark 10.8
history 10.7
tourism 10.7
outdoor 10.7
military 10.6
beach 10.1
statue 9.8
covering 9.5
sea 9.4
equipment 9.1
leisure 9.1
protection 9.1
one 8.9
soldier 8.8
rock 8.7
scene 8.6
stone 8.6
uniform 8.5
safety 8.3
island 8.2
alone 8.2
religion 8.1
work 7.8
architecture 7.8
portrait 7.8
men 7.7
old 7.7
consumer goods 7.6
horizontal 7.5
fun 7.5
monument 7.5
famous 7.4
sport 7.4
tourist 7.2
coast 7.2
holiday 7.2
summer 7.1

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

clothing 95.3
person 93.3
old 74.1
text 67.1
clothes 15.3

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 37-45
Gender Female, 68.7%
Calm 63.4%
Angry 12.9%
Sad 12.2%
Disgusted 4%
Surprised 2.6%
Confused 2.5%
Fear 1.3%
Happy 1%

AWS Rekognition

Age 41-49
Gender Female, 96.4%
Surprised 32.4%
Calm 19.9%
Fear 16%
Sad 12.9%
Happy 11.1%
Confused 2.8%
Disgusted 2.6%
Angry 2.3%

AWS Rekognition

Age 22-30
Gender Male, 99%
Fear 91.5%
Calm 3%
Angry 1.8%
Happy 1.2%
Disgusted 1%
Sad 0.9%
Surprised 0.4%
Confused 0.3%

AWS Rekognition

Age 10-18
Gender Female, 97.8%
Happy 46.2%
Calm 24.7%
Sad 14.3%
Fear 7.2%
Angry 2.4%
Surprised 1.9%
Confused 1.7%
Disgusted 1.6%

AWS Rekognition

Age 31-41
Gender Male, 93.8%
Sad 36.1%
Happy 32.8%
Fear 7.3%
Surprised 6.6%
Angry 6.5%
Calm 4.9%
Disgusted 3.9%
Confused 1.9%

Feature analysis

Amazon

Person 97.6%
Bicycle 94.5%
Helmet 86.5%