Human Generated Data

Title

Cemetery in Queens, N.Y.

Date

1952

People

Artist: Andreas Feininger, American, 1906 - 1999

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.763

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Landscape 92.1
Outdoors 92.1
Nature 92.1
Asphalt 91.6
Tarmac 91.6
Scenery 86.8
Road 86.3
Panoramic 78.6
Crowd 72.6
Building 70.7
Art 66.1
Aerial View 65.6
Zebra Crossing 56.8
Collage 56.6
Advertisement 56.6
Poster 56.6
Pedestrian 55.6
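
The label/confidence pairs above match the shape of Amazon Rekognition's DetectLabels output. A minimal sketch of how a comparable tag list could be generated with boto3 follows; the local image file, MaxLabels, and MinConfidence values are illustrative assumptions, not details taken from this record.

```python
# Sketch: fetch label/confidence pairs like the Amazon tags above
# using AWS Rekognition DetectLabels via boto3. The file path,
# MaxLabels, and MinConfidence values are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")  # region/credentials come from your AWS config

with open("cemetery_in_queens.jpg", "rb") as f:  # hypothetical local copy of the image
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=20,          # the list above shows 17 labels
    MinConfidence=55.0,    # lowest confidence shown above is 55.6
)

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```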

Clarifai
created on 2023-10-25

people 98.6
abstract 96.7
crowd 96.2
desktop 95.5
street 95.4
many 95.2
group 93.4
art 93.1
urban 92.5
monochrome 92.5
texture 91.3
vintage 91.1
no person 89.9
group together 89.4
retro 89.3
house 89.1
abstraction 87.9
reflection 87.3
man 87
old 86.9

Imagga
created on 2022-01-08

staple 67
paper fastener 53.8
fence 46.3
fastener 45.3
picket fence 38.3
barrier 32.2
grunge 30.6
texture 30.6
restraint 30.4
old 30
wall 26
snow 23.3
textured 22.8
structure 22.2
obstruction 20.4
weathered 19.9
pattern 19.1
vintage 19
surface 18.5
winter 17.9
material 17.9
rough 17.3
dirty 17.2
damaged 17.2
grungy 17.1
device 16.8
aged 16.3
wallpaper 16.1
paint 15.4
tree 15.3
rusty 15.2
wood 14.2
landscape 14.1
backdrop 14
black 13.8
forest 13.1
antique 13
metal 12.9
design 11.8
season 11.7
backgrounds 11.4
building 11.3
brown 11
rust 10.6
frost 10.6
worn 10.5
art 10.4
ancient 10.4
frame 10
retro 9.8
outdoors 9.7
cold 9.5
stone 8.9
trees 8.9
detail 8.8
snowy 8.7
paper 8.6
empty 8.6
blank 8.6
iron 8.4
worm fence 8.4
dark 8.3
metallic 8.3
weather 8.2
industrial 8.2
ice 8.1
close 8
rural 7.9
memorial 7.9
covered 7.8
messy 7.7
decay 7.7
construction 7.7
stain 7.7
industry 7.7
frozen 7.6
gravestone 7.6
park 7.4
mountains 7.4
light 7.3
gray 7.2
colorful 7.2
wooden 7
seasonal 7
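
The Imagga tags above likewise pair an English tag name with a confidence score. A small sketch of how such tags could be requested is given below, assuming Imagga's v2 /tags endpoint; the credentials and image URL are placeholders, and the response shape is an assumption based on that API rather than anything in this record.

```python
# Sketch: request tag/confidence pairs like the Imagga list above from
# Imagga's v2 tagging endpoint. Credentials and image URL are placeholders;
# the endpoint and response shape are assumptions based on Imagga's v2 API.
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/cemetery_in_queens.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP Basic auth with key/secret
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```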

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

text 96.8
cemetery 65.8
crowd 47.4

Color Analysis

Face Analysis

Amazon

AWS Rekognition

Age 14-22
Gender Female, 53.7%
Calm 43.3%
Happy 17.4%
Sad 12.4%
Angry 9%
Fear 7%
Confused 6.4%
Disgusted 2.6%
Surprised 1.9%

AWS Rekognition

Age 18-24
Gender Male, 88.6%
Calm 24.5%
Fear 23.6%
Sad 23.2%
Surprised 11.9%
Happy 6.5%
Disgusted 4.1%
Angry 3.6%
Confused 2.6%

AWS Rekognition

Age 18-26
Gender Male, 89.5%
Sad 48%
Happy 20.6%
Calm 14.3%
Fear 9.3%
Angry 2.6%
Surprised 2.5%
Confused 1.6%
Disgusted 1.1%

AWS Rekognition

Age 21-29
Gender Male, 75%
Calm 77.3%
Sad 12%
Angry 4%
Happy 2.5%
Fear 1.5%
Disgusted 1.3%
Confused 0.8%
Surprised 0.5%

AWS Rekognition

Age 28-38
Gender Male, 90.5%
Calm 71.7%
Happy 26.2%
Fear 0.4%
Angry 0.4%
Surprised 0.4%
Sad 0.3%
Disgusted 0.3%
Confused 0.2%

AWS Rekognition

Age 22-30
Gender Female, 82.6%
Sad 92.8%
Calm 2.2%
Confused 2%
Surprised 1.5%
Fear 0.5%
Happy 0.3%
Disgusted 0.3%
Angry 0.3%
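
Each block above reports an age range, a gender estimate with confidence, and a descending list of emotion scores, which is the shape of Amazon Rekognition's DetectFaces results when all attributes are requested. A minimal sketch follows, assuming a local copy of the image; the file path is an illustrative assumption.

```python
# Sketch: reproduce per-face age/gender/emotion summaries like those above
# with AWS Rekognition DetectFaces (Attributes=["ALL"]). The image source
# is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("cemetery_in_queens.jpg", "rb") as f:  # hypothetical local copy
    image_bytes = f.read()

faces = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],   # required to get AgeRange, Gender, and Emotions
)["FaceDetails"]

for face in faces:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort descending to match the lists above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```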

Categories

Imagga

paintings art 96.7%
text visuals 1.8%