Human Generated Data

Title

Untitled (girl standing next to tree)

Date

1964

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16834

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.5
Apparel 99.5
Person 99.3
Human 99.3
Dress 96.4
Shorts 94.8
Female 94.6
Tree 94.4
Plant 94.4
Nature 91.1
Vegetation 90.8
Outdoors 90.5
Face 89.8
Rock 88.7
Standing 82.7
Woman 77.3
Girl 73.2
Grass 72.9
Portrait 72.7
Photography 72.7
Photo 72.7
Woodland 70.3
Forest 70.3
Land 70.3
Ground 68.1
People 63.8
Kid 62.3
Child 62.3
Skirt 59.6
Blonde 59
Teen 59

Clarifai
created on 2023-10-29

people 99.9
one 99.4
child 99
adult 98.3
monochrome 96.1
two 95.7
woman 94.7
portrait 93.9
wear 93.2
girl 91.2
man 89.2
group together 87.6
group 86.4
tree 86
veil 84.9
park 82.5
son 81.9
administration 81.9
facial expression 79.7
dress 78.2

Imagga
created on 2022-02-26

child 25.9
tree 24.6
forest 23.5
park 23.1
outdoors 21.3
people 21.2
outdoor 17.6
landscape 17.1
walking 16.1
monk 15.9
stone 15.7
travel 15.5
trees 15.1
person 14.9
hiking 14.4
walk 14.3
ascent 14.2
summer 14.1
rock 13.9
slope 13.3
autumn 12.3
boy 12.2
grass 11.9
hike 11.7
mountain 11.6
woods 11.5
couple 11.3
mountains 11.1
love 11.1
man 10.8
natural 10.7
old 10.5
outside 10.3
sky 10.2
day 10.2
sport 10.1
tourist 9.7
scenic 9.7
life 9.6
male 9.6
adventure 9.5
light 9.4
megalith 9.2
memorial 9.2
leisure 9.1
adult 9.1
fall 9.1
world 8.9
mother 8.7
lifestyle 8.7
path 8.5
countryside 8.2
road 8.1
active 8.1
activity 8.1
sun 8.1
water 8
juvenile 7.9
country 7.9
black 7.9
trunk 7.7
happy 7.5
wood 7.5
tourism 7.4
structure 7.3
smiling 7.2
parent 7.2
sunset 7.2
childhood 7.2
bright 7.2
portrait 7.1
women 7.1
season 7
leaves 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.6
clothing 92.4
person 90.7
text 85
black and white 83.1
footwear 73.1
monochrome 52.1

Color Analysis

Face analysis

AWS Rekognition

Age 26-36
Gender Female, 94.2%
Calm 71.9%
Sad 6.6%
Surprised 5.2%
Happy 4.8%
Disgusted 4.5%
Confused 3%
Fear 2.6%
Angry 1.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 99.3%

Categories

Text analysis

Amazon

3
4

Google

MJ17--YTERA2- -NAa
MJ17--YTERA2-
-NAa