Human Generated Data

Title

Untitled (bride and groom posed looking at three old women sitting on couch)

Date

1949

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6179

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.5
Human 99.5
Clothing 99.4
Apparel 99.4
Person 98.9
Person 98.8
Person 97.7
Person 97.5
Face 91.5
Suit 89.2
Coat 89.2
Overcoat 89.2
People 87.2
Indoors 76.3
Room 76.3
Furniture 76.2
Chair 76.2
Photography 69.6
Portrait 69.6
Photo 69.6
Female 62.5
Shirt 59
Text 58.6
Tuxedo 57.9
Man 57.2
Senior Citizen 57
Steamer 55.4

Clarifai
created on 2023-10-26

people 99.9
group 99.6
leader 98.5
group together 98.3
adult 97.2
several 96.9
man 96.8
woman 95.5
administration 95.5
elderly 93.6
three 91.2
many 90.7
recreation 90.6
five 87.9
wear 86.9
chair 86.4
four 85.7
actor 80.3
handshake 80
child 79.9

Imagga
created on 2022-01-23

people 29
couple 27.9
happy 25
male 24.5
adult 24.2
person 23.9
bride 23
man 22.8
dress 22.6
groom 21.4
love 21.3
happiness 21.1
wedding 20.2
portrait 19.4
women 18.2
fashion 18.1
smiling 17.3
smile 17.1
two 16.9
teacher 16.5
child 16.3
cheerful 16.2
family 16
husband 15.6
pretty 14
clothing 13.7
home 13.5
marriage 13.3
boy 13
kin 12.9
married 12.5
bouquet 12.4
wife 12.3
lady 12.2
celebration 12
attractive 11.9
life 11.9
ceremony 11.6
face 11.4
men 11.2
gown 10.9
cute 10.8
mother 10.6
old 10.4
world 10
romance 9.8
kid 9.7
human 9.7
new 9.7
full length 9.7
indoors 9.7
together 9.6
flowers 9.6
lifestyle 9.4
holiday 9.3
children 9.1
blackboard 9.1
holding 9.1
style 8.9
educator 8.8
hair 8.7
hands 8.7
party 8.6
grandma 8.5
youth 8.5
elegance 8.4
hand 8.3
color 8.3
outdoors 8.2
year 8.2
professional 8.2
handsome 8
wed 7.9
day 7.8
brunette 7.8
standing 7.8
black 7.8
education 7.8
bridal 7.8
gift 7.7
musical instrument 7.7
wall 7.7
enjoying 7.6
traditional 7.5
retro 7.4
girls 7.3
celebrate 7.2
childhood 7.2
interior 7.1

Google
created on 2022-01-23 (no tags returned)

Microsoft
created on 2022-01-23

person 99.6
clothing 94.8
man 92.5
woman 88.6
text 86.9
wedding dress 79.3
human face 73.9
bride 62.6
old 48.5

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 97.8%
Calm 65.2%
Sad 12.6%
Happy 5.8%
Confused 5.7%
Disgusted 3.2%
Surprised 3.1%
Fear 2.7%
Angry 1.6%

AWS Rekognition

Age 54-62
Gender Male, 99%
Calm 93.2%
Sad 3.4%
Happy 1.1%
Confused 0.9%
Surprised 0.8%
Disgusted 0.2%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 51-59
Gender Male, 99.2%
Calm 64.8%
Sad 21%
Surprised 11.4%
Confused 1.2%
Happy 0.6%
Angry 0.6%
Fear 0.2%
Disgusted 0.2%

AWS Rekognition

Age 41-49
Gender Male, 92%
Calm 98.5%
Sad 1.3%
Surprised 0.1%
Angry 0%
Confused 0%
Happy 0%
Disgusted 0%
Fear 0%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Possible
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Text analysis

Amazon

EIRW
лигсо SALEJA EIRW
SALEJA
лигсо

Google

ts 2 S st MJ17 WT3ㅋA2 0320
S
MJ17
WT3
A2
0320
ts
2
st