Human Generated Data

Title

Untitled (three men and three women doing pyramid formation in living room)

Date

1965-1970

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10143

Machine Generated Data

Tags

Amazon
created on 2022-01-29

Clothing 99.4
Apparel 99.4
Person 99.3
Human 99.3
Person 98.4
Person 98.1
Person 97.8
Person 96.9
Person 86.2
Coat 84.3
Indoors 71.1
People 68.5
Face 62.4
Photo 62.4
Portrait 62.4
Photography 62.4
Door 60.8
Room 59.4
Female 57.1
Accessories 55
Sunglasses 55
Accessory 55

Imagga
created on 2022-01-29

groom 100
bride 52.1
dress 47
wedding 46.9
couple 42.7
people 38.5
love 37.1
married 35.5
person 33.7
happiness 32.1
adult 30.4
marriage 29.5
bouquet 29.3
happy 25.1
two 23.7
man 23.5
ceremony 23.3
women 22.2
veil 21.6
celebration 21.5
male 21.3
flowers 20.9
smile 20.7
gown 19.1
fashion 18.9
portrait 18.8
wife 18
family 17.8
nurse 17.3
husband 17.2
men 17.2
church 16.7
smiling 16.6
romantic 16
elegance 16
suit 15.3
romance 15.2
wed 14.7
looking 14.4
together 14
clothing 13.9
face 13.5
day 13.3
attractive 13.3
flower 13.1
cheerful 13
lady 12.2
outdoors 11.9
bridal 11.7
engagement 11.6
new 11.3
human 11.3
matrimony 10.8
patient 10.8
kiss 10.7
holding 10.7
clothes 10.3
model 10.1
boutique 10.1
newly 9.9
pretty 9.8
posing 9.8
loving 9.5
lifestyle 9.4
life 9.3
event 9.2
room 9.2
joy 9.2
hand 9.1
park 9.1
old 9.1
commitment 8.9
indoors 8.8
hair 8.7
hands 8.7
holiday 8.6
formal 8.6
outside 8.6
wall 8.6
rose 8.4
relationship 8.4
outdoor 8.4
future 8.4
summer 8.4
tradition 8.3
one 8.2
religion 8.1
tuxedo 7.9
case 7.7
youth 7.7
ring 7.6
togetherness 7.6
emotion 7.4
teacher 7.4
professional 7.2
black 7.2
home 7.2
cute 7.2
interior 7.1

Microsoft
created on 2022-01-29

indoor 87.1
black and white 76.3
text 76.2
black 68.7
room 59.7

Face analysis

AWS Rekognition

Age 45-51
Gender Male, 56.1%
Happy 78.8%
Surprised 15.5%
Calm 2.3%
Confused 1.1%
Sad 0.9%
Angry 0.7%
Disgusted 0.5%
Fear 0.2%

AWS Rekognition

Age 36-44
Gender Male, 98.9%
Calm 71.1%
Sad 21.1%
Surprised 2.3%
Happy 1.7%
Confused 1.4%
Disgusted 1%
Angry 1%
Fear 0.5%

AWS Rekognition

Age 50-58
Gender Male, 99.6%
Surprised 82.3%
Calm 11.4%
Happy 3.9%
Disgusted 1.3%
Angry 0.5%
Confused 0.4%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 24-34
Gender Female, 69.7%
Calm 77.8%
Happy 8%
Angry 6.9%
Sad 2.3%
Fear 1.4%
Surprised 1.3%
Confused 1.3%
Disgusted 1.1%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in a room 82.4%
a group of stuffed animals in a room 44.1%
a group of men standing next to a window 44%

Text analysis

Amazon

MAGOZ
PAGOA

Google

IH
AY
202100 S INT IH AY Swww 134LALLLLLLL www.
202100
Swww
S
INT
134LALLLLLLL
www.