Human Generated Data

Title

Untitled (man, woman, and three children outside)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16760

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 100
Shorts 100
Clothing 100
Human 99.8
Person 99.8
Person 99.8
Person 99.4
Person 99.1
Person 98.1
Shoe 92.4
Footwear 92.4
Female 91.2
Vegetation 79.3
Plant 79.3
Woman 74
Transportation 69.8
Train Track 68.6
Railway 68.6
Rail 68.6
Skirt 67.6
Tree 67.1
Standing 64.3
Girl 63.8
Outdoors 63.5
Face 61.7
Portrait 61.7
Photography 61.7
Photo 61.7
Ground 61.3
Forest 58.8
Woodland 58.8
Land 58.8
Nature 58.8
Urban 56.8

Imagga
created on 2022-02-26

swing 44.2
mechanical device 39.1
plaything 34.9
mechanism 29.1
child 25.2
people 21.8
man 20.8
adult 20.7
outdoor 18.4
portrait 16.8
outdoors 15.9
person 15.2
male 15
love 14.2
kin 13.8
body 13.6
sunset 13.5
park 13.2
happy 13.2
lifestyle 13
pretty 12.6
silhouette 12.4
autumn 12.3
parent 12
fun 12
sport 11.8
athlete 11.7
sexy 11.2
dad 11.2
youth 11.1
grass 11.1
summer 10.9
play 10.3
hair 10.3
black 10.2
happiness 10.2
beach 10.1
joy 10
leisure 10
active 9.9
kid 9.7
couple 9.6
boy 9.6
hands 9.6
women 9.5
runner 9.4
relax 9.3
field 9.2
girls 9.1
attractive 9.1
holding 9.1
dirty 9
health 9
human 9
father 9
sibling 8.8
forest 8.7
naked 8.7
model 8.6
resort area 8.4
dark 8.4
action 8.3
fashion 8.3
one 8.2
life 8.2
romance 8
family 8
posing 8
outside 7.7
old 7.7
grunge 7.7
two 7.6
hand 7.6
track 7.6
walking 7.6
relaxation 7.5
free 7.5
vacation 7.4
cheerful 7.3
lady 7.3
sensuality 7.3
exercise 7.3
smiling 7.2
fitness 7.2
cute 7.2
recreation 7.2
activity 7.2
face 7.1
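
Imagga exposes this kind of tagging through its REST API. A sketch using the /v2/tags endpoint, with placeholder credentials and image URL:

```python
import requests

# Hypothetical credentials and image URL, for illustration only.
API_KEY = "your_api_key"
API_SECRET = "your_api_secret"
IMAGE_URL = "https://example.com/photo.jpg"

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),  # HTTP basic auth with key/secret
)
response.raise_for_status()

for item in response.json()["result"]["tags"]:
    # Each entry pairs an English tag with a 0-100 confidence score.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```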

Google
created on 2022-02-26
(no tags listed)

Microsoft
created on 2022-02-26

outdoor 97
text 95.9
clothing 93
person 92.1
footwear 88.8
black and white 78.2
fog 69.3
tree 68.4
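
These tags resemble the output of Azure Computer Vision's tagging operation. A sketch using the Python SDK's tag_image call; the endpoint and subscription key are placeholders:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key, for illustration only.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

result = client.tag_image("https://example.com/photo.jpg")
for tag in result.tags:
    # The SDK reports confidence on a 0-1 scale; the list above
    # appears to show the same values scaled to percentages.
    print(f"{tag.name} {tag.confidence * 100:.1f}")
```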

Face analysis

AWS Rekognition

Age 35-43
Gender Male, 99.4%
Happy 61.4%
Calm 26.1%
Confused 3.2%
Surprised 3.1%
Fear 2.8%
Disgusted 1.3%
Sad 1.3%
Angry 0.8%

AWS Rekognition

Age 18-26
Gender Female, 89.7%
Sad 74.1%
Calm 23.2%
Fear 1.4%
Confused 0.4%
Surprised 0.3%
Happy 0.2%
Disgusted 0.2%
Angry 0.2%

AWS Rekognition

Age 37-45
Gender Female, 97.7%
Calm 78.1%
Happy 9.7%
Sad 6%
Surprised 1.8%
Fear 1.5%
Confused 1.3%
Angry 0.8%
Disgusted 0.7%

AWS Rekognition

Age 21-29
Gender Male, 95.2%
Calm 96.1%
Sad 2.9%
Disgusted 0.3%
Happy 0.3%
Angry 0.2%
Surprised 0.1%
Fear 0.1%
Confused 0.1%
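
The four records above carry the fields that Rekognition's DetectFaces API returns when all facial attributes are requested. A sketch of the call, again with a placeholder image location:

```python
import boto3

# Sketch only: bucket and key are hypothetical placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

response = client.detect_faces(
    Image={"S3Object": {"Bucket": "my-bucket", "Name": "photo.jpg"}},
    Attributes=["ALL"],  # request age range, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back unsorted; sort by confidence to match the
    # descending order shown in the records above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')
```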

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
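
Google Vision reports face attributes as likelihood buckets rather than numeric scores, which is why every row above reads "Very unlikely". A sketch of the call, assuming credentials are configured via GOOGLE_APPLICATION_CREDENTIALS and using a placeholder image URI:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()
image = vision.Image(
    source=vision.ImageSource(image_uri="https://example.com/photo.jpg")
)

# Likelihood enum values are ints 0-5; index into readable labels.
likelihood = ("Unknown", "Very unlikely", "Unlikely",
              "Possible", "Likely", "Very likely")

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", likelihood[face.surprise_likelihood])
    print("Anger", likelihood[face.anger_likelihood])
    print("Sorrow", likelihood[face.sorrow_likelihood])
    print("Joy", likelihood[face.joy_likelihood])
    print("Headwear", likelihood[face.headwear_likelihood])
    print("Blurred", likelihood[face.blurred_likelihood])
```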

Feature analysis

Amazon

Person 99.8%
Shoe 92.4%

Captions

Microsoft

a group of people standing around a fire hydrant 28.7%
a group of people that are standing in the grass 28.6%
a couple of people that are standing in the grass 28.5%
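
Ranked caption candidates with confidence scores like these come from Azure Computer Vision's describe operation. A sketch with placeholder endpoint and key:

```python
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Hypothetical endpoint and key, for illustration only.
client = ComputerVisionClient(
    "https://<resource-name>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("your_subscription_key"),
)

description = client.describe_image(
    "https://example.com/photo.jpg",
    max_candidates=3,  # request several caption candidates, as above
)
for caption in description.captions:
    # Confidence is on a 0-1 scale; scaled to percentages here to
    # match the figures shown above.
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```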