Human Generated Data

Title

Untitled (wood stove)

Date

1968

People

Artist: Barbara Norfleet, American (b. 1926)

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1955

Copyright

© Barbara Norfleet

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Person 99.4
Wood 84.4
Building 73.1
Soil 72.8
Railway 72.2
Rail 72.2
Train Track 72.2
Transportation 72.2
Urban 66.6
Weapon 61.7
Weaponry 61.7
Archaeology 60.6
Ground 60.1
Bunker 59.6
Machine 58.5
Cannon 56.3
Vehicle 55.9

Imagga
created on 2022-01-08

factory 28.3
landscape 25.3
handcart 21.4
old 20.9
barrow 20.4
machine 19.4
rural 19.4
plant 19
wheeled vehicle 17.8
shovel 17.8
outdoor 16.8
tool 16.4
tree 16.2
vehicle 15.5
man 15.5
water 14.7
sky 14.7
countryside 14.6
outdoors 13.6
forest 13.1
grass 12.6
building complex 12
industry 12
field 11.7
travel 11.3
structure 11.2
house 10.9
farm 10.7
environment 10.7
work 10.5
person 10.3
people 10
sunset 9.9
hand tool 9.8
mountain 9.8
destruction 9.8
farmer 9.7
agriculture 9.6
rust 9.6
land 9.5
spring 9.4
male 9.2
industrial 9.1
park 9.1
dirty 9
scenery 9
summer 9
device 8.9
scenic 8.8
light 8.7
antique 8.7
fishing 8.6
construction 8.6
farming 8.5
boat 8.5
clouds 8.4
sunrise 8.4
conveyance 8.3
building 8.2
earth 8.2
rubbish 8.1
transportation 8.1
chain saw 7.9
disaster 7.8
abandoned 7.8
stone 7.7
outside 7.7
rusty 7.6
vintage 7.6
evening 7.5
tractor 7.4
lake 7.3
container 7.3
road 7.2
recreation 7.2
trees 7.1
working 7.1
sea 7
country 7
season 7

Google
created on 2022-01-08

Photograph 94.1
Plant 91
Black 89.7
Tree 87.2
Wood 83.7
Monochrome photography 77
Monochrome 75.9
Gas 74.9
Pollution 73.9
Door 72
Event 68.1
Stock photography 66.7
Landscape 63.6
Soil 60
History 55.9
Motor vehicle 52.8
Cooking 50.4

Microsoft
created on 2022-01-08

outdoor 99.9
tree 99.6
black and white 86.5
person 85.1
picnic 83.8
clothing 58.8

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 53.5%
Calm 83.1%
Sad 10%
Angry 2.5%
Confused 1.7%
Disgusted 1.4%
Fear 0.6%
Happy 0.4%
Surprised 0.3%

AWS Rekognition

Age 23-33
Gender Male, 88.6%
Calm 58.5%
Sad 26.4%
Fear 11.1%
Angry 1.1%
Disgusted 1.1%
Surprised 0.8%
Confused 0.6%
Happy 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a person sitting at a picnic table 72.6%
a group of people sitting at a picnic table 58.5%
a person sitting on a picnic table 58.4%