Human Generated Data

Title

Untitled (man and boy walking outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17710

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 100
Apparel 100
Person 99.6
Human 99.6
Person 97.9
Female 95.6
Fashion 94.1
Gown 94.1
Robe 94
Wedding 90.4
Dress 88.7
Woman 85.1
Plant 83.1
Vehicle 82.6
Automobile 82.6
Car 82.6
Transportation 82.6
Wedding Gown 80.2
Bride 79.2
Accessories 77
Accessory 77
Tie 77
Evening Dress 74.1
Grass 67.9
Photography 67.2
Photo 67.2
Outdoors 64.6
Portrait 63.9
Face 63.9
Girl 63.1
Bridegroom 62.9
Tree 60.5
Suit 57.9
Overcoat 57.9
Coat 57.9

Imagga
created on 2022-02-26

world 39.5
man 28.9
silhouette 24.9
person 23.6
people 22.9
sunset 22.5
male 18.6
umbrella 18
beach 16
water 15.4
sky 15.3
outdoors 14.8
sport 14.6
adult 14.2
dark 14.2
canopy 13.8
summer 13.5
ocean 12.5
sun 12.3
light 11.3
men 11.2
landscape 11.2
fountain 11
sea 11
lifestyle 10.8
vacation 10.6
travel 10.6
black 10.5
dusk 10.5
couple 10.5
boy 10.4
standing 10.4
walking 10.4
shelter 10.3
happiness 10.2
adolescent 10.1
happy 10
leisure 10
outdoor 9.9
clothing 9.8
love 9.5
evening 9.3
clouds 9.3
relaxation 9.2
danger 9.1
life 9
active 9
juvenile 8.7
walk 8.6
adventure 8.5
serene 8.5
sunrise 8.4
human 8.3
park 8.2
dirty 8.1
structure 8.1
businessman 8
fog 7.7
coat 7.7
old 7.7
hand 7.6
groom 7.6
freedom 7.3
body 7.2
recreation 7.2
protective covering 7.2
wet 7.2
sand 7.1
lab coat 7
together 7
attractive 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 99.5
black and white 93.2
text 87.5
golf 85.1
monochrome 57.4
clothing 53.9
tree 52.6

Face analysis

AWS Rekognition

Age 34-42
Gender Female, 89%
Surprised 77%
Angry 12.9%
Calm 5.4%
Happy 2.9%
Fear 0.5%
Confused 0.4%
Sad 0.4%
Disgusted 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.6%
Car 82.6%
Tie 77%

Captions

Microsoft

a person standing in front of a building 66%
a person standing in front of a building 61.1%
a person standing next to a building 57.7%