Human Generated Data

Title

Tableau in Torcello

Date

1970s

People

Artist: Benjamin Hertzberg, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.2591

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 99.7
Person 99.7
Person 99.5
Person 96.8
Person 95
Person 90.1
Outdoors 87
Shorts 86.6
Clothing 86.6
Apparel 86.6
Vehicle 80
Boat 80
Transportation 80
Nature 79
Building 78.7
Plant 73.9
Tree 73.9
Urban 70.1
Wall 69.1
Person 69.1
People 66.6
Countryside 64.7
Standing 61.9
Rural 56.3
Bunker 56

Imagga
created on 2021-12-14

cemetery 46.4
old 37
architecture 29.8
castle 29.3
stone 27.3
landscape 26
building 24.8
grunge 23.9
wall 22.5
structure 21.5
travel 21.1
jigsaw puzzle 20.8
vintage 20.7
city 20
ancient 19.9
landmark 19.9
history 18.8
gravestone 18.7
antique 17.3
sky 17.3
palace 17
puzzle 16.5
house 16.4
historic 15.6
retro 15.6
memorial 15.3
tower 15.2
tree 15.1
fortification 14.9
tourism 14.9
texture 14.6
fortress 14.2
historical 14.1
trees 13.3
aged 12.7
paper 12.6
medieval 12.5
roof 12.4
scenery 11.7
river 11.6
autumn 11.4
scenic 11.4
game 11.3
forest 11.3
frame 10.8
hill 10.3
sepia 9.7
rural 9.7
grungy 9.5
town 9.3
grain 9.2
outdoor 9.2
lake 9.2
rough 9.1
art 9.1
park 9.1
brown 8.8
textured 8.8
houses 8.7
grass 8.7
water 8.7
aging 8.6
old fashioned 8.6
weathered 8.6
buildings 8.5
monument 8.4
famous 8.4
window 8.2
countryside 8.2
outdoors 8.2
style 8.2
dirty 8.1
yellow 8
artistic 7.8
black 7.8
stained 7.7
culture 7.7
rustic 7.7
panorama 7.6
snow 7.4
church 7.4
tourist 7.4
branch 7.3
paint 7.3
sun 7.2
fall 7.2
color 7.2
religion 7.2
country 7
season 7

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

grass 100
outdoor 99.9
tree 99.4
black and white 87
text 81.8
person 55.9
grave 54.3
old 42.2

Face analysis

AWS Rekognition

Age 23-37
Gender Male, 98.9%
Surprised 62.9%
Angry 23.9%
Calm 10.6%
Disgusted 1.6%
Confused 0.5%
Fear 0.3%
Happy 0.2%
Sad 0.1%

AWS Rekognition

Age 25-39
Gender Female, 88.8%
Happy 65.3%
Calm 30.5%
Sad 1.8%
Fear 1.3%
Surprised 0.4%
Disgusted 0.3%
Confused 0.2%
Angry 0.2%

AWS Rekognition

Age 22-34
Gender Female, 98.9%
Happy 92.8%
Calm 4.2%
Sad 1.9%
Angry 0.3%
Fear 0.3%
Confused 0.2%
Surprised 0.2%
Disgusted 0.1%

AWS Rekognition

Age 28-44
Gender Female, 77.6%
Happy 86.9%
Calm 12.5%
Sad 0.3%
Angry 0.1%
Surprised 0.1%
Fear 0.1%
Confused 0.1%
Disgusted 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Boat 80%

Captions

Microsoft

a group of people in a field 75.6%
a group of people standing on top of a grass covered field 61.1%
a group of people that are standing in the grass 61%