Human Generated Data

Title

Untitled (bride and groom)

Date

1948

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.429.11

Machine Generated Data

Tags

Amazon
created on 2019-03-25

Person 99.7
Human 99.7
Person 97.2
Apparel 95.5
Clothing 95.5
Coat 94.8
Overcoat 94.8
Suit 94.5
Tuxedo 83.5
Finger 78
Leisure Activities 73.5
Face 60.1
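
The pairs above are label/confidence scores (in percent) returned by an automated label-detection service. As a rough, hypothetical sketch of how tags of this form can be produced (not the museum's actual pipeline; the bucket and file names are placeholders), the AWS Rekognition DetectLabels API returns Name/Confidence pairs like those listed:

```python
# Hypothetical sketch: label/confidence tags via AWS Rekognition DetectLabels (boto3).
# Bucket and object key are placeholders, not the museum's actual storage.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.429.11.jpg"}},
    MaxLabels=20,
    MinConfidence=50,
)

for label in response["Labels"]:
    # Prints e.g. "Person 99.7", matching the tag/score format above
    print(f"{label['Name']} {label['Confidence']:.1f}")
```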

Clarifai
created on 2019-03-25

people 99.9
two 99.3
adult 98.6
man 97.9
group 96.8
leader 96.4
three 96.2
administration 96
woman 95.4
portrait 95.1
actor 91.9
group together 91.7
actress 89.4
facial expression 88.4
four 87.7
music 86.9
wear 85.8
chair 83.2
retro 81.9
wedding 81.4

Imagga
created on 2019-03-25

groom 79.5
man 37.7
couple 37.5
happy 33.2
people 31.8
male 29.5
love 28.4
person 27.2
happiness 25.1
smiling 23.9
adult 22.9
home 22.3
smile 21.4
together 21
bride 18.2
attractive 18.2
lifestyle 18.1
two 17.8
sitting 17.2
dress 17.2
family 16.9
wedding 16.6
married 16.3
cheerful 15.5
celebration 15.2
romantic 15.1
meeting 15.1
mature 14.9
group 14.5
portrait 14.2
women 14.2
wife 14.2
indoors 14.1
holding 14
office 13.7
mother 13.5
business 13.4
marriage 13.3
enjoying 13.3
table 13
men 12.9
husband 12.7
drink 12.5
romance 12.5
standing 12.2
suit 12
waiter 12
fun 12
businesswoman 11.8
bouquet 11.8
senior 11.3
casual 11
handsome 10.7
businessman 10.6
executive 10.6
boy 10.4
corporate 10.3
day 10.2
successful 10.1
child 10
professional 9.8
restaurant 9.6
elderly 9.6
tie 9.5
bow tie 9.5
party 9.5
pair 9.4
work 9.4
relationship 9.4
wine 9.2
room 9.2
face 9.2
indoor 9.1
success 8.9
dining-room attendant 8.8
30s 8.7
drinking 8.6
glass 8.6
adults 8.5
laughing 8.5
old 8.4
joy 8.4
aged 8.1
daughter 8.1
looking 8
interior 8
partners 7.8
partner 7.7
date 7.7
horizontal 7.5
worker 7.5
house 7.5
employee 7.4
lady 7.3
confident 7.3
clothing 7.3
celebrate 7.2
black 7.2
team 7.2
holiday 7.2
kitchen 7.2

Google
created on 2019-03-25

Microsoft
created on 2019-03-25

person 99.7
wall 95.4
old 42.6
black and white 11.6
wedding 7
retro 5.5
monochrome 4.2

Color Analysis

Face analysis

AWS Rekognition

Age 57-77
Gender Male, 98.5%
Calm 0.3%
Happy 89.9%
Sad 1%
Angry 1.6%
Disgusted 1.2%
Confused 1.7%
Surprised 4.2%

AWS Rekognition

Age 26-43
Gender Female, 99.7%
Happy 95.6%
Sad 0.2%
Confused 0.6%
Angry 1.2%
Disgusted 0.8%
Calm 0.1%
Surprised 1.6%
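
The age range, gender, and emotion confidences above are typical of the AWS Rekognition DetectFaces response when all facial attributes are requested. A minimal, hedged sketch (not the project's actual code; the image location is a placeholder):

```python
# Hypothetical sketch: age range, gender, and emotion confidences via
# AWS Rekognition DetectFaces (boto3), the kind of values listed above.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "4.2002.429.11.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotions
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        # Emotion types come back uppercase (e.g. "HAPPY"); title-case for display
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```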

Microsoft Cognitive Services

Age 37
Gender Female

Microsoft Cognitive Services

Age 63
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely
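
The likelihood ratings above correspond to the Google Cloud Vision face-detection response, which reports joy, sorrow, anger, surprise, headwear, and blur as likelihood enums (VERY_UNLIKELY through VERY_LIKELY). A hedged sketch, assuming a local copy of the image file:

```python
# Hypothetical sketch: face-detection likelihoods via the Google Cloud Vision API,
# which returns enum values such as VERY_LIKELY / VERY_UNLIKELY like the ratings above.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("4.2002.429.11.jpg", "rb") as f:  # placeholder local file
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    print("Joy", face.joy_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Surprise", face.surprise_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```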

Feature analysis

Amazon

Person 99.7%
Suit 94.5%

Categories