Human Generated Data

Title

Untitled (people dressed for luau, next to boat)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16748

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99.9
Apparel 99.9
Human 99.8
Person 99.8
Person 99.4
Dress 99.4
Female 98.4
Person 97.7
Person 97.2
Person 95.4
Woman 93.7
Shorts 90.7
Outdoors 86
Skirt 85.9
Nature 85.4
Face 77
Plant 76.7
Girl 68.3
Person 68.2
People 67.9
Rail 66.9
Train Track 66.9
Transportation 66.9
Railway 66.9
Portrait 65.2
Photography 65.2
Photo 65.2
Tree 63.4
Blossom 62.2
Flower 62.2
Suit 61.9
Overcoat 61.9
Coat 61.9
Gown 56.9
Fashion 56.9
Evening Dress 56.9
Robe 56.9
Vehicle 56.7
Teen 55.9
Kid 55.9
Child 55.9
Blonde 55.9
Water 55.6
Ocean 55.4
Sea 55.4
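The numbers beside each tag above are confidence scores (percentages) reported by the tagging service. A minimal sketch of how such (label, score) pairs might be filtered by a confidence threshold, using a few values from the Amazon list above (plain tuples only, no SDK call is made and no specific API is assumed):

```python
# Hypothetical sketch: filter machine-generated labels by confidence.
# The (label, score) pairs are copied from the Amazon tag list above.
labels = [
    ("Clothing", 99.9),
    ("Person", 99.8),
    ("Dress", 99.4),
    ("Train Track", 66.9),
    ("Sea", 55.4),
]

def confident_labels(pairs, threshold=90.0):
    """Keep only labels whose confidence meets the threshold."""
    return [name for name, score in pairs if score >= threshold]

print(confident_labels(labels))  # → ['Clothing', 'Person', 'Dress']
```

A threshold around 90 keeps the high-certainty tags and drops speculative ones such as "Train Track", which likely reflects the boat's trailer or dock rails.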

Imagga
created on 2022-02-26

kin 47
beach 34.2
man 26.9
people 25.1
child 24.6
couple 23.5
vacation 22.1
sea 21.9
summer 21.9
outdoors 21.6
person 19.1
love 18.9
adult 18.8
sarong 18.8
male 18.5
together 18.4
ocean 18.2
happy 18.2
water 18
sand 17.7
outdoor 16
sky 15.9
skirt 15
smiling 14.5
lifestyle 14.4
two 14.4
walking 14.2
happiness 14.1
clothing 13.5
world 13.4
family 13.3
parent 13.1
portrait 12.9
outside 12.8
travel 12.7
sunset 12.6
walk 12.4
mother 12.2
garment 12
fun 12
dress 11.7
boy 11.3
active 10.8
holiday 10.7
romantic 10.7
holding 9.9
coast 9.9
romance 9.8
fashion 9.8
seaside 9.8
sun 9.7
bride 9.6
women 9.5
wife 9.5
day 9.4
shore 9.3
lake 9.2
leisure 9.1
pretty 9.1
park 9.1
lady 8.9
husband 8.8
groom 8.7
dad 8.6
relax 8.4
joy 8.4
father 8.3
life 8.3
wedding 8.3
girls 8.2
little 7.9
sunny 7.7
sitting 7.7
men 7.7
tourist 7.7
attractive 7.7
enjoying 7.6
kids 7.5
relationship 7.5
relaxing 7.3
cute 7.2

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

outdoor 91.8
clothing 91.3
black and white 87.6
person 87.4
text 82.2
vehicle 81
woman 77.9
dress 68.5
old 60.9

Face analysis

AWS Rekognition

Age 51-59
Gender Male, 99.8%
Happy 99.6%
Calm 0.3%
Surprised 0.1%
Confused 0%
Sad 0%
Disgusted 0%
Angry 0%
Fear 0%

AWS Rekognition

Age 33-41
Gender Male, 81.7%
Calm 80.4%
Happy 18.6%
Sad 0.2%
Surprised 0.2%
Disgusted 0.2%
Confused 0.1%
Angry 0.1%
Fear 0.1%

AWS Rekognition

Age 20-28
Gender Female, 79.4%
Happy 53.5%
Calm 27%
Disgusted 11%
Sad 3.8%
Confused 1.6%
Angry 1.5%
Surprised 1.2%
Fear 0.4%

AWS Rekognition

Age 16-24
Gender Male, 93%
Calm 44.1%
Happy 36.1%
Surprised 7.1%
Angry 5.5%
Sad 4.7%
Confused 1%
Disgusted 0.9%
Fear 0.5%
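Each face record above assigns a confidence percentage to every emotion. A minimal sketch of selecting the dominant emotion from one such record, using the values of the last face listed (a plain dict standing in for the service's response; no actual API is called):

```python
# Hypothetical sketch: pick the dominant emotion from a face-analysis
# record shaped like the AWS Rekognition results above (emotion -> %).
face = {
    "Calm": 44.1,
    "Happy": 36.1,
    "Surprised": 7.1,
    "Angry": 5.5,
    "Sad": 4.7,
    "Confused": 1.0,
    "Disgusted": 0.9,
    "Fear": 0.5,
}

def dominant_emotion(emotions):
    """Return the (name, confidence) pair with the highest confidence."""
    return max(emotions.items(), key=lambda kv: kv[1])

print(dominant_emotion(face))  # → ('Calm', 44.1)
```

Note that the top score here is below 50%, so for this face the "dominant" label is only a plurality, not a confident call.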

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.8%

Captions

Microsoft

a vintage photo of a group of people standing next to a train 83.6%
a vintage photo of a group of people standing in front of a train 79.1%
a vintage photo of a group of people posing for the camera 79%

Text analysis

Amazon

31
ALVITA
Craft
=
Chair Craft
Chair

Google

31
EVITA
EVITA 31