Human Generated Data

Title

Untitled (two girls standing outside in front of fence)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17303

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Shorts 100
Clothing 100
Apparel 100
Person 99.8
Human 99.8
Person 99.8
Shoe 97.2
Footwear 97.2
Shoe 96.8
Shoe 84.7
Female 79
Shoe 77.4
Tree 72.7
Plant 72.7
Girl 70.2
Vegetation 65
Face 61.4
People 60.7
Smile 58.9
Fence 57.1
Child 56.5
Kid 56.5
Bush 55.5

Imagga
created on 2022-02-26

crutch 30.9
staff 23.9
person 22.7
people 20.7
man 19.7
fashion 18.9
adult 18.2
stick 18
model 17.1
posing 16.9
attractive 16.8
dress 16.3
portrait 16.2
sexy 16.1
lady 15.4
body 14.4
sunset 13.5
black 13.5
walking 13.3
male 13
beach 12.7
pretty 12.6
silhouette 12.4
lifestyle 12.3
style 11.9
women 11.9
world 11.5
human 11.3
sensuality 10.9
pose 10.9
dark 10.9
clothing 10.7
hair 10.3
wall 10.3
leg 10.1
fitness 9.9
couple 9.6
street 9.2
outdoor 9.2
vintage 9.1
danger 9.1
sensual 9.1
exercise 9.1
dirty 9
love 8.7
water 8.7
life 8.6
sport 8.5
legs 8.5
child 8.5
relaxation 8.4
summer 8.4
leisure 8.3
holding 8.3
gorgeous 8.2
happy 8.2
active 8.1
standing 7.8
old 7.7
health 7.6
walk 7.6
skin 7.6
elegance 7.6
outdoors 7.5
sun 7.3
looking 7.2
cute 7.2
face 7.1
businessman 7.1

Google
created on 2022-02-26

Footwear 98
Smile 94.9
Gesture 85.3
Dress 83.6
Black-and-white 82.9
Flash photography 81.2
Plant 79.9
Tree 78.9
Tints and shades 77.1
Monochrome photography 72.3
Vintage clothing 72.2
Grass 70.8
Monochrome 70.7
Fence 66.9
Boot 66.9
Room 65.1
Art 63.7
Fun 63.5
Classic 61
Photo caption 60.1

Microsoft
created on 2022-02-26

outdoor 98.6
clothing 97.7
person 96
footwear 92.5
smile 88.2
text 86.6
posing 84.9
human face 77.3
black and white 68.8
woman 53.2

Face analysis

AWS Rekognition

Age 13-21
Gender Male, 60.1%
Calm 40.8%
Happy 31.9%
Sad 12.8%
Angry 5.4%
Confused 3.9%
Fear 2.1%
Surprised 1.7%
Disgusted 1.4%

AWS Rekognition

Age 24-34
Gender Female, 53.5%
Calm 81.3%
Fear 9.8%
Happy 5.5%
Sad 1.6%
Disgusted 0.6%
Surprised 0.5%
Confused 0.5%
Angry 0.3%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.8%
Shoe 97.2%

Captions

Microsoft

a person posing for a photo 95%
a person posing for a picture 94.9%
a person posing for a photo in front of a building 89.4%

Text analysis

Amazon

92
E
NACCY
A4
VT27082