Human Generated Data

Title

Untitled (bearded man seated on bench under tree by small building)

Date

c. 1890 - c. 1915

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.4796

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Person 99.4
Human 99.4
Outdoors 99.1
Nature 98.8
Building 95.7
Housing 95.7
Rural 95.2
Countryside 95.2
Shelter 95.2
Furniture 91.9
Plant 83.7
Tree 83.7
House 83
Vegetation 71.5
Hut 69.6
Bench 65.4
Land 64.7
Garden 63.9
Shack 63.7
Forest 61.1
Woodland 61.1
Grove 61.1
Arbour 58.9
Grassland 58.4
Field 58.4
Porch 57.8
Cabin 55.4
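The label/confidence pairs above match the shape of AWS Rekognition's DetectLabels response. As a minimal sketch (the service call itself is commented out because it needs AWS credentials and an image file; the file name and sample values are illustrative), this is how such a tag list can be produced and flattened:

```python
# Sketch of producing a tag list like the Amazon section above from a
# Rekognition DetectLabels-shaped response. The API call is hedged out;
# the post-processing below runs on a response-shaped sample dict.

# import boto3
# client = boto3.client("rekognition")
# with open("photo.jpg", "rb") as f:  # hypothetical file name
#     response = client.detect_labels(Image={"Bytes": f.read()},
#                                     MinConfidence=55)

# Response-shaped sample, with values taken from the tag list above:
response = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.4},
        {"Name": "Outdoors", "Confidence": 99.1},
        {"Name": "Bench", "Confidence": 65.4},
        {"Name": "Cabin", "Confidence": 55.4},
    ]
}

def tags_from_labels(resp, min_confidence=0.0):
    """Flatten a DetectLabels response into (name, confidence) pairs,
    sorted by descending confidence and rounded to one decimal place."""
    pairs = [
        (lbl["Name"], round(lbl["Confidence"], 1))
        for lbl in resp["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

for name, score in tags_from_labels(response, min_confidence=60):
    print(f"{name} {score}")
```

The `MinConfidence` request parameter filters server-side; the local filter above shows the equivalent client-side step.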

Imagga
created on 2022-01-29

wheeled vehicle 73.6
barrow 68.5
handcart 57.3
vehicle 47.4
forest 40.9
tree 37.6
tricycle 37.2
autumn 34.3
park 32.1
landscape 32
conveyance 31.4
trees 30.3
fall 23.5
woods 21
outdoors 19.6
scenic 18.4
rural 17.6
grass 17.4
path 17
season 16.4
old 16
country 15.8
leaves 15.8
foliage 15.7
leaf 15.6
branches 15.4
scenery 15.3
countryside 14.6
outdoor 13.8
peaceful 13.7
colors 13.2
bench 13
winter 12.8
light 12.7
yellow 12.6
wood 12.5
environment 12.3
scene 12.1
sun 12.1
road 11.7
fog 11.6
natural 11.4
travel 11.3
snow 11.2
cold 11.2
branch 11
morning 10.9
hiking 10.6
sunrise 10.3
summer 10.3
spring 10.2
sky 10.2
day 10.2
orange 10
sunset 9.9
sunlight 9.8
building 9.8
golden 9.5
colorful 9.3
garden 9.2
vintage 9.1
people 8.9
seat 8.7
dawn 8.7
walking 8.5
evening 8.4
field 8.4
misty 7.9
lane 7.8
park bench 7.8
seasons 7.8
sunny 7.7
mist 7.7
outside 7.7
grunge 7.7
walk 7.6
brown 7.4
new 7.3
stone 7.3
life 7.2
seasonal 7

Google
created on 2022-01-29

Microsoft
created on 2022-01-29

tree 99.9
outdoor 97.9
grass 97.9
person 80.7
field 79.1
clothing 73.1
text 72
old 62.3
child 60.6
black and white 53.7
plant 52.7
vintage 26.3

Face analysis

AWS Rekognition

Age 54-64
Gender Male, 100%
Calm 98.3%
Sad 0.8%
Angry 0.3%
Confused 0.3%
Happy 0.2%
Disgusted 0.1%
Surprised 0.1%
Fear 0.1%
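The age, gender, and emotion lines above follow the shape of AWS Rekognition's DetectFaces response (with `Attributes=["ALL"]`). A minimal sketch, with the service call hedged out and a response-shaped sample built from the values in this section:

```python
# Sketch: rendering DetectFaces output as the "Attribute value%" lines
# used in the AWS Rekognition section above. The API call is commented
# out (it needs credentials and image bytes); the formatter runs on a
# response-shaped sample dict.

# import boto3
# client = boto3.client("rekognition")
# response = client.detect_faces(Image={"Bytes": image_bytes},
#                                Attributes=["ALL"])

response = {
    "FaceDetails": [{
        "AgeRange": {"Low": 54, "High": 64},
        "Gender": {"Value": "Male", "Confidence": 100.0},
        "Emotions": [
            {"Type": "CALM", "Confidence": 98.3},
            {"Type": "SAD", "Confidence": 0.8},
            {"Type": "ANGRY", "Confidence": 0.3},
        ],
    }]
}

def summarize_face(face):
    """Render one FaceDetail entry as plain 'Attribute value%' lines."""
    lines = [f"Age {face['AgeRange']['Low']}-{face['AgeRange']['High']}"]
    gender = face["Gender"]
    lines.append(f"Gender {gender['Value']}, {gender['Confidence']:.0f}%")
    # Emotions come back unordered; sort by descending confidence.
    for emo in sorted(face["Emotions"],
                      key=lambda e: e["Confidence"], reverse=True):
        lines.append(f"{emo['Type'].capitalize()} {emo['Confidence']}%")
    return lines

for line in summarize_face(response["FaceDetails"][0]):
    print(line)
```

Note that emotion confidences are per-emotion scores, not a probability distribution, so they need not sum to 100%.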

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.4%

Captions

Microsoft

a vintage photo of a tree 86.8%
a vintage photo of a forest 84.7%
a vintage photo of some people in a forest 74.4%