Human Generated Data

Title

Untitled (collecting the sap)

Date

1968

People

Artist: Barbara Norfleet, American, born 1926

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1954

Copyright

© Barbara Norfleet

Machine Generated Data

Tags (label followed by model confidence, on a 0-100 scale)

Amazon
created on 2022-01-08

Person 99.8
Human 99.8
Outdoors 97
Garden 96.1
Gardener 89.5
Worker 89.5
Gardening 89.5
Tree 89.3
Plant 89.3
Portrait 64.8
Photography 64.8
Face 64.8
Photo 64.8
Tree Trunk 60.2
Bucket 55.5
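
As a hedged illustration only (an assumption about tooling, not the museum's documented pipeline), label/confidence pairs of this shape can be produced with the AWS Rekognition DetectLabels API. A minimal Python sketch, assuming boto3 is installed with valid AWS credentials and that "photo.jpg" is a hypothetical local copy of the image:

    import boto3

    # Hypothetical setup: requires AWS credentials configured locally.
    client = boto3.client("rekognition", region_name="us-east-1")

    # "photo.jpg" is a stand-in filename, not the museum's actual asset.
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MaxLabels=20,
            MinConfidence=50,
        )

    # Print label/confidence pairs in the same shape as the list above.
    for label in response["Labels"]:
        print(label["Name"], round(label["Confidence"], 1))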

Clarifai
created on 2023-10-25

people 99.5
tree 99.2
one 98.4
monochrome 97.9
bucket 97.1
adult 96.6
wood 95
man 94.8
garden 92
street 89.7
exert 89.6
two 89.4
nature 88.2
trunk 84.1
woman 83.4
park 82.8
old 82.2
farming 80.9
leaf 80.1
child 79.7

Imagga
created on 2022-01-08

tree 35.6
forest 32.2
bucket 26.1
vessel 25.2
outdoor 22.9
outdoors 22.5
container 21.3
park 20.6
people 19.5
man 19.5
woods 19.1
farmer 19
trees 18.7
person 18.1
rural 15.9
male 14.9
autumn 14.9
tool 14.3
chain saw 14.3
adult 14.2
wood 14.2
country 14
grass 12.6
old 12.5
leisure 12.5
machine 12.4
summer 12.2
outside 12
child 11.9
spring 11.8
garden 11.7
walk 11.4
leaves 11.4
natural 11.4
season 10.9
power saw 10.7
power tool 10.7
couple 10.5
happiness 10.2
water jug 10
fall 10
attractive 9.8
landscape 9.7
hiking 9.6
work 9.6
lifestyle 9.4
field 9.2
pretty 9.1
active 9
recreation 9
activity 9
worker 8.9
agriculture 8.8
day 8.6
cleaner 8.5
plant 8.4
foliage 8.3
fun 8.2
branch 8.2
children 8.2
happy 8.1
farm 8
jug 8
working 8
hair 7.9
cute 7.9
love 7.9
boy 7.8
expression 7.7
orange 7.7
casual 7.6
two 7.6
woody plant 7.6
action 7.4
cheerful 7.3
village 7.3
smiling 7.2
portrait 7.1
women 7.1
job 7.1
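
Imagga exposes similar tagging through a REST endpoint. A minimal sketch, assuming hypothetical API credentials and a stand-in image URL; the endpoint and response shape follow Imagga's v2 tagging API:

    import requests

    # Hypothetical credentials; Imagga's v2 API uses HTTP Basic auth.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"
    IMAGE_URL = "https://example.org/photo.jpg"  # stand-in URL

    response = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": IMAGE_URL},
        auth=(API_KEY, API_SECRET),
    )

    # Each entry pairs a 0-100 confidence with a tag name, as listed above.
    for entry in response.json()["result"]["tags"]:
        print(entry["tag"]["en"], round(entry["confidence"], 1))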

Google
created on 2022-01-08

Microsoft
created on 2022-01-08

tree 100
outdoor 99.9
clothing 94.8
person 94.7
black and white 88.1
man 85.8
waste container 83.8
monochrome 70.2
plant 39.8
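
Tags like these correspond to the Azure Computer Vision tagging operation. A minimal sketch using the azure-cognitiveservices-vision-computervision SDK, assuming a hypothetical endpoint and key; the SDK reports confidence on a 0-1 scale, scaled here to match the percentages above:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hypothetical endpoint and key for an Azure Computer Vision resource.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_subscription_key"),
    )

    # "photo.jpg" is a stand-in for a local copy of the image.
    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)

    # The SDK returns confidence in [0, 1]; scale to percent.
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))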

Color Analysis

Face analysis

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
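
These likelihood buckets (Very unlikely through Very likely) match the FaceAnnotation fields returned by the Google Cloud Vision face-detection API. A minimal sketch, assuming the google-cloud-vision client library and Google Cloud credentials are configured, with "photo.jpg" again a hypothetical local copy of the image:

    from google.cloud import vision

    # Hypothetical setup: requires Google Cloud credentials.
    client = vision.ImageAnnotatorClient()

    with open("photo.jpg", "rb") as f:
        image = vision.Image(content=f.read())

    response = client.face_detection(image=image)

    # Each detected face carries likelihood buckets like those above.
    for face in response.face_annotations:
        print("Joy:", vision.Likelihood(face.joy_likelihood).name)
        print("Sorrow:", vision.Likelihood(face.sorrow_likelihood).name)
        print("Anger:", vision.Likelihood(face.anger_likelihood).name)
        print("Surprise:", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear:", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred:", vision.Likelihood(face.blurred_likelihood).name)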

Feature analysis

Amazon

Person 99.8%

Text analysis

Amazon

V
V 31
DUTTON
31
M DUTTON
LISURE
M
EXPONDER
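
Fragments like these are typical of OCR run on incidental text in a photograph (signage, labels, stamped lettering). As an illustrative sketch only, the AWS Rekognition DetectText API returns detections of this shape, under the same hypothetical setup as the labeling sketch above:

    import boto3

    # Hypothetical setup: requires AWS credentials configured locally.
    client = boto3.client("rekognition", region_name="us-east-1")

    with open("photo.jpg", "rb") as f:
        response = client.detect_text(Image={"Bytes": f.read()})

    # Detections come back as LINE and WORD entries with confidence scores;
    # partial or garbled fragments (e.g., from worn lettering) are common.
    for detection in response["TextDetections"]:
        if detection["Type"] == "LINE":
            print(detection["DetectedText"], round(detection["Confidence"], 1))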