Human Generated Data

Title

Untitled (european traders, Africa)

Date

1910s

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.3177

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.7
Human 99.7
Person 99.5
Person 99.5
Person 99.2
Person 99.1
Tree 96.5
Plant 96.5
Arecaceae 96
Palm Tree 96
Outdoors 87.1
Person 78.8
Nature 72.7
Summer 72.1
Person 71.9
Building 66.2
Garden 59.6
Arbour 57.9
Tropical 55.3
Person 51.2

Imagga
created on 2022-01-08

snow 86.1
tree 81.9
forest 53.2
winter 46.9
landscape 46.2
trees 45.4
weather 41.2
park 40.4
woody plant 38.5
cold 32.8
season 30.5
frost 28.8
vascular plant 25.5
autumn 24.6
snowy 24.4
birch 24.2
woods 23.9
scenery 23.5
ice 22.6
outdoors 22.5
scene 20.8
frozen 20.1
sky 19.9
rural 19.4
scenic 19.3
fall 19
river 18.7
peaceful 18.3
branch 18.3
sun 17.2
plant 17.1
morning 16.3
road 16.3
country 15.9
countryside 15.6
freeze 14.6
wood 14.2
travel 14.1
fog 13.5
light 13.4
frosty 12.7
branches 12.6
cemetery 12.5
wilderness 12.1
natural 12.1
field 11.7
silver tree 11.5
seasonal 11.4
leaves 11.4
water 11.4
path 11.4
new 11.4
yellow 11.3
lake 11
leaf 10.9
pine 10.8
snowfall 10.8
recreation 10.8
covered 10.7
trail 10.7
scenics 10.5
day 10.2
bright 10
sunlight 9.8
mountain 9.8
sapling 9.7
mountains 9.3
outdoor 9.2
national 9.1
old 9.1
sunset 9
lane 8.8
grass 8.7
sunrise 8.5
evening 8.4
tourist 8.3
calm 8.3
environment 8.2
colors 7.9
rime 7.9
hoarfrost 7.9
range 7.9
new england 7.9
chill 7.9
evergreen 7.8
icy 7.8
rays 7.7
orange 7.7
poplar 7.5
silhouette 7.5
town 7.4
land 7.4
vacation 7.4
cool 7.1
area 7

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

outdoor 99.4
tree 98.4
palm tree 86.4
plant 74.7
person 68.6
palm 65.4
park 62.7
black and white 50.7

Face analysis

AWS Rekognition

Age 48-56
Gender Female, 99.1%
Angry 73.1%
Calm 24.5%
Sad 0.7%
Happy 0.6%
Disgusted 0.5%
Confused 0.3%
Surprised 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Female, 58.2%
Calm 99.3%
Angry 0.2%
Happy 0.1%
Surprised 0.1%
Confused 0.1%
Sad 0.1%
Disgusted 0%
Fear 0%

AWS Rekognition

Age 34-42
Gender Male, 98.4%
Angry 42.7%
Sad 18.2%
Surprised 9.9%
Disgusted 8.4%
Happy 8.4%
Calm 6%
Confused 4.3%
Fear 2.1%

AWS Rekognition

Age 29-39
Gender Male, 100%
Calm 68%
Happy 21.8%
Surprised 4.6%
Disgusted 2.2%
Sad 1.5%
Angry 0.7%
Confused 0.7%
Fear 0.5%

AWS Rekognition

Age 21-29
Gender Male, 99.6%
Calm 64.6%
Sad 19.6%
Happy 8.1%
Confused 3.3%
Surprised 1.5%
Fear 1.4%
Angry 0.7%
Disgusted 0.7%

AWS Rekognition

Age 11-19
Gender Male, 73.9%
Calm 77.6%
Sad 17.5%
Confused 2%
Happy 1%
Fear 0.8%
Disgusted 0.4%
Surprised 0.4%
Angry 0.1%

AWS Rekognition

Age 31-41
Gender Male, 96.1%
Happy 84.7%
Surprised 7.5%
Calm 3.3%
Sad 1.3%
Fear 1.1%
Disgusted 0.9%
Confused 0.6%
Angry 0.6%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%

Captions

Microsoft

a group of people standing next to a palm tree 95%
a group of men standing next to a palm tree 91.9%
a group of people standing in front of a palm tree 91.8%