Human Generated Data

Title

Untitled (man, woman, and boy standing outside large house)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17787

Machine Generated Data

Tags
(scores are each service's confidence values, in percent)

Amazon
created on 2022-02-26

Clothing 99.7
Apparel 99.7
Coat 99
Human 99
Person 99
Person 96.6
Person 94.2
Accessories 83.9
Tie 83.9
Accessory 83.9
Overcoat 80.1
People 79.1
Dress 78.1
Road 74.9
Suit 74.6
Gravel 72.8
Dirt Road 72.8
Face 71.5
Standing 69.8
Female 66.4
Photo 66.3
Portrait 66.3
Photography 66.3
Ground 63.7
Text 62.1
Kid 61.2
Child 61.2
Hand 59.6
Drawing 55.7
Art 55.7
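
The Amazon list above has the shape of AWS Rekognition DetectLabels output (label name followed by a confidence score). As a hedged illustration, a minimal boto3 sketch that would print tags in this form; the file name and the MinConfidence threshold are placeholder assumptions, not values from this record:

    import boto3

    # Hedged sketch of AWS Rekognition's DetectLabels.
    # "photo.jpg" and MinConfidence=55 are placeholders.
    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=55,
        )
    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")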

Imagga
created on 2022-02-26

kin 61.7
sunset 25.2
man 24.2
people 23.4
male 19.3
silhouette 19
person 18.9
beach 17.7
outdoor 15.3
outdoors 14.9
summer 14.8
world 14.1
sky 14
couple 13.9
black 13.8
love 13.4
walking 13.3
adult 13.2
child 12.4
boy 12.2
sun 12.1
men 12
sport 11.8
happiness 11.8
portrait 11.7
family 11.6
water 11.3
youth 11.1
life 10.4
women 10.3
lifestyle 10.1
happy 10
fan 10
leisure 10
dusk 9.5
relationship 9.4
light 9.4
evening 9.3
two 9.3
ocean 9.1
mother 9.1
hand 9.1
art 9.1
park 9.1
businessman 8.8
body 8.8
together 8.8
sea 8.6
walk 8.6
outside 8.6
athlete 8.4
old 8.4
joy 8.4
health 8.3
vacation 8.2
dress 8.1
follower 8.1
romantic 8
run 7.7
sibling 7.6
statue 7.6
head 7.6
field 7.5
dark 7.5
sunrise 7.5
fun 7.5
vintage 7.4
holding 7.4
father 7.4
exercise 7.3
sexy 7.2
fitness 7.2
active 7.2
recreation 7.2
religion 7.2
parent 7.2
sunlight 7.1
grass 7.1
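
Imagga exposes tagging through a REST API; a minimal sketch against its /v2/tags endpoint that would print tag/score pairs like those above. The API key, secret, and image URL are placeholders:

    import requests

    # Hedged sketch of Imagga's /v2/tags endpoint.
    # Credentials and image URL are placeholders.
    API_KEY = "your_api_key"
    API_SECRET = "your_api_secret"

    resp = requests.get(
        "https://api.imagga.com/v2/tags",
        params={"image_url": "https://example.com/photo.jpg"},
        auth=(API_KEY, API_SECRET),
    )
    for tag in resp.json()["result"]["tags"]:
        print(tag["tag"]["en"], round(tag["confidence"], 1))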

Google
created on 2022-02-26

(no tags recorded)

Microsoft
created on 2022-02-26

ground 98.5
outdoor 97.5
standing 96.2
grass 95.2
house 85.6
field 78.6
clothing 74.4
posing 65.3
person 62.9
old 58.3
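
The Microsoft tags are consistent with Azure Computer Vision's image-tagging operation. A minimal sketch with the Python SDK follows; the endpoint, key, and file name are placeholders, and Azure reports confidence on a 0-1 scale, so it is scaled to percent here to match the list:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hedged sketch: endpoint, key, and file name are placeholders.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )
    with open("photo.jpg", "rb") as f:
        result = client.tag_image_in_stream(f)
    for tag in result.tags:
        print(tag.name, round(tag.confidence * 100, 1))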

Face analysis

AWS Rekognition

Age 30-40
Gender Female, 63.5%
Happy 99%
Confused 0.2%
Sad 0.2%
Surprised 0.1%
Calm 0.1%
Angry 0.1%
Fear 0.1%
Disgusted 0%

AWS Rekognition

Age 39-47
Gender Male, 99.7%
Happy 43.2%
Calm 33.6%
Surprised 19.7%
Disgusted 1.1%
Sad 0.8%
Confused 0.7%
Angry 0.5%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Female, 97.7%
Happy 99.1%
Calm 0.5%
Sad 0.1%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Fear 0%
Disgusted 0%
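
The three blocks above are per-face results in the shape of AWS Rekognition DetectFaces run with Attributes=["ALL"] (age range, gender, and emotion confidences). A minimal boto3 sketch; the file name is a placeholder:

    import boto3

    client = boto3.client("rekognition")
    with open("photo.jpg", "rb") as f:  # placeholder file name
        response = client.detect_faces(
            Image={"Bytes": f.read()},
            Attributes=["ALL"],  # required for age, gender, and emotions
        )
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        gender = face["Gender"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
        # Emotions come back unsorted; sort by descending confidence
        # to match the ordering shown above.
        for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
            print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")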

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Unlikely
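
The per-face likelihood ratings above (Very unlikely through Very likely) match Google Cloud Vision's face detection, which reports each attribute as a likelihood enum rather than a score. A minimal sketch with the google-cloud-vision client; the file name is a placeholder:

    from google.cloud import vision

    client = vision.ImageAnnotatorClient()
    with open("photo.jpg", "rb") as f:  # placeholder file name
        image = vision.Image(content=f.read())
    response = client.face_detection(image=image)
    for face in response.face_annotations:
        # Each likelihood is an enum: VERY_UNLIKELY ... VERY_LIKELY.
        print("Joy", vision.Likelihood(face.joy_likelihood).name)
        print("Anger", vision.Likelihood(face.anger_likelihood).name)
        print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
        print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
        print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
        print("Blurred", vision.Likelihood(face.blurred_likelihood).name)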

Feature analysis

Amazon

Coat 99%
Person 99%
Tie 83.9%

Captions

Microsoft

a man standing in front of a building 86.7%
an old man standing in front of a building 83.8%
a man standing in front of an old building 83.7%
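
The ranked captions above match Azure Computer Vision's describe operation, which returns several candidate captions with confidences. A minimal sketch, again with placeholder endpoint, key, and file name, scaling the 0-1 confidence to percent:

    from azure.cognitiveservices.vision.computervision import ComputerVisionClient
    from msrest.authentication import CognitiveServicesCredentials

    # Hedged sketch: endpoint, key, and file name are placeholders.
    client = ComputerVisionClient(
        "https://<your-resource>.cognitiveservices.azure.com/",
        CognitiveServicesCredentials("your_key"),
    )
    with open("photo.jpg", "rb") as f:
        result = client.describe_image_in_stream(f, max_candidates=3)
    for caption in result.captions:
        print(f"{caption.text} {caption.confidence * 100:.1f}%")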