Human Generated Data

Title

Untitled (group portrait with bride and groom)

Date

1945

People

Artist: John Deusing, American, active 1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.1651

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Human 98.4
Person 98.4
Person 98
Person 97.4
Apparel 97.1
Clothing 97.1
Person 96.6
Person 94.5
Person 93.7
People 92.1
Person 88.7
Face 80.9
Female 77.8
Gown 77.6
Fashion 77.6
Robe 77
Wedding 67.5
Photography 62.1
Photo 62.1
Woman 62
Family 61.5
Portrait 61.5
Wedding Gown 59.4
Clinic 58.6
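The Rekognition tags above pair each label with a confidence score. A minimal Python sketch of filtering this list at a threshold (values transcribed from above, deduplicated to one score per label; the helper name `confident_tags` is illustrative):

```python
# Amazon Rekognition tags for this photograph (label -> confidence),
# transcribed from the list above, keeping the top score per label.
amazon_tags = {
    "Human": 98.4, "Person": 98.4, "Apparel": 97.1, "Clothing": 97.1,
    "People": 92.1, "Face": 80.9, "Female": 77.8, "Gown": 77.6,
    "Fashion": 77.6, "Robe": 77.0, "Wedding": 67.5, "Photography": 62.1,
    "Photo": 62.1, "Woman": 62.0, "Family": 61.5, "Portrait": 61.5,
    "Wedding Gown": 59.4, "Clinic": 58.6,
}

def confident_tags(tags, threshold=90.0):
    """Return labels at or above the confidence threshold, best first."""
    keep = {label: score for label, score in tags.items() if score >= threshold}
    return sorted(keep, key=keep.get, reverse=True)

high_confidence = confident_tags(amazon_tags)
```

At a 90-point cutoff only the generic people/clothing labels survive; the scene-specific tags (Wedding, Gown, Portrait) all fall below it.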

Imagga
created on 2021-12-14

silhouette 30.6
people 27.9
man 23.1
team 21.5
person 20.8
male 17.7
design 16.3
art 16
teamwork 15.8
businessman 15
work 14.1
business 14
crowd 13.4
sexy 12.8
president 12.8
vibrant 12.3
bright 12.1
presentation 12.1
group 12.1
fun 12
nighttime 11.7
audience 11.7
stadium 11.7
sport 11.5
patriotic 11.5
job 11.5
boss 11.5
symbol 11.4
black 11.4
nation 11.4
drawing 11.2
men 11.2
lights 11.1
icon 11.1
negative 11
businesswoman 10.9
cheering 10.8
speech 10.8
flag 10.4
portrait 10.4
bride 10
supporters 9.9
leader 9.6
sketch 9.6
groom 9.6
happiness 9.4
grunge 9.4
vivid 9.3
dance 9.2
occupation 9.2
film 9.1
painting 9
retro 9
boutique 8.9
family 8.9
happy 8.8
couple 8.7
love 8.7
party 8.6
color 8.3
representation 8.3
decoration 8.2
dress 8.1
star 8.1
activity 8.1
marriage 7.6
lifestyles 7.6
fashion 7.5
meeting 7.5
vintage 7.4
event 7.4
adult 7.4
girls 7.3
life 7.3
figure 7.3
player 7.2
women 7.1
modern 7
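Imagga's list mixes plausible terms (bride, groom, portrait) with off-target ones (stadium, businessman, president). A small sketch, using scores copied from above, of checking which catalogue-relevant terms Imagga actually produced (the `catalogue_terms` list is an assumption drawn from the human-generated title):

```python
# A subset of Imagga tag scores transcribed from the list above.
imagga_tags = {
    "silhouette": 30.6, "people": 27.9, "man": 23.1, "team": 21.5,
    "businessman": 15.0, "stadium": 11.7, "portrait": 10.4, "bride": 10.0,
    "groom": 9.6, "family": 8.9, "couple": 8.7, "marriage": 7.6,
}

# Terms suggested by the human-generated record (the title mentions
# a group portrait with bride and groom).
catalogue_terms = ["bride", "groom", "wedding", "portrait"]

found = {term: imagga_tags.get(term) for term in catalogue_terms}
# "wedding" is absent from Imagga's output; its nearest term is "marriage" at 7.6.
```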

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wedding dress 98.2
text 98.2
bride 95.7
dress 86.1
clothing 80.2
woman 78.7
person 77.8
sketch 67.8
posing 67.6
wedding 51.5
drawing 50.2
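The services use different vocabularies, so comparing them requires normalizing case. A sketch of the overlap between the Amazon and Microsoft tag sets shown above:

```python
# Tag vocabularies transcribed from the Amazon and Microsoft lists above.
amazon = {"Human", "Person", "Apparel", "Clothing", "People", "Face",
          "Female", "Gown", "Fashion", "Robe", "Wedding", "Photography",
          "Photo", "Woman", "Family", "Portrait", "Wedding Gown", "Clinic"}
microsoft = {"wedding dress", "text", "bride", "dress", "clothing", "woman",
             "person", "sketch", "posing", "wedding", "drawing"}

# Case-insensitive intersection of the two vocabularies.
shared = {t.lower() for t in amazon} & {t.lower() for t in microsoft}
```

Only four generic terms are shared; near-synonyms like "Gown" vs. "wedding dress" do not match on exact string comparison.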

Face analysis

AWS Rekognition

Age 50-68
Gender Male, 78%
Sad 95.5%
Calm 3.6%
Fear 0.4%
Confused 0.1%
Angry 0.1%
Surprised 0.1%
Happy 0.1%
Disgusted 0%

AWS Rekognition

Age 50-68
Gender Male, 92.8%
Sad 42.8%
Calm 34.7%
Confused 14.9%
Surprised 3.8%
Happy 2%
Disgusted 0.7%
Angry 0.7%
Fear 0.4%

AWS Rekognition

Age 23-37
Gender Female, 60.3%
Happy 75.5%
Sad 12.7%
Calm 10.5%
Confused 0.6%
Surprised 0.5%
Angry 0.1%
Fear 0.1%
Disgusted 0.1%

AWS Rekognition

Age 23-35
Gender Male, 78.6%
Happy 46.8%
Calm 29.9%
Sad 16.5%
Surprised 3.9%
Confused 1.5%
Angry 0.6%
Fear 0.4%
Disgusted 0.3%

AWS Rekognition

Age 24-38
Gender Male, 66.4%
Happy 73.5%
Calm 20%
Sad 2.3%
Surprised 2.1%
Confused 1%
Angry 0.6%
Disgusted 0.4%
Fear 0.1%
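Each Rekognition face record above is an emotion distribution. A sketch of reducing each face to its dominant emotion (top three scores per face copied from above; `dominant_emotion` is an illustrative helper, not a Rekognition API call):

```python
# Leading emotion scores for the five faces above (percent, AWS Rekognition).
faces = [
    {"Sad": 95.5, "Calm": 3.6, "Fear": 0.4},
    {"Sad": 42.8, "Calm": 34.7, "Confused": 14.9},
    {"Happy": 75.5, "Sad": 12.7, "Calm": 10.5},
    {"Happy": 46.8, "Calm": 29.9, "Sad": 16.5},
    {"Happy": 73.5, "Calm": 20.0, "Sad": 2.3},
]

def dominant_emotion(distribution):
    """Label with the highest score in one face's emotion distribution."""
    return max(distribution, key=distribution.get)

dominant = [dominant_emotion(face) for face in faces]
```

Two faces read as predominantly sad and three as happy, a split worth noting given the formal group-portrait setting.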

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
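Unlike Rekognition, Google Vision reports likelihood buckets rather than percentages. A sketch, under the assumption that the buckets form a 0-4 ordinal scale, of scoring the one face above that differs from the rest (only the first face has Joy at "Unlikely"; every other attribute on every face is "Very unlikely"):

```python
# Ordinal mapping for Google Vision's likelihood buckets (an assumption
# for comparison purposes; the API itself returns enum values).
LIKELIHOOD = {"Very unlikely": 0, "Unlikely": 1, "Possible": 2,
              "Likely": 3, "Very likely": 4}

# The first Google Vision face record above.
first_face = {"Surprise": "Very unlikely", "Anger": "Very unlikely",
              "Sorrow": "Very unlikely", "Joy": "Unlikely",
              "Headwear": "Very unlikely", "Blurred": "Very unlikely"}

scores = {attr: LIKELIHOOD[bucket] for attr, bucket in first_face.items()}
```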

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a group of people posing for a photo 75.8%
a group of people posing for the camera 75.7%
a group of people posing for a picture 75.6%
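Microsoft returns several candidate captions, each with a confidence. A minimal sketch of selecting the top-scoring caption from the three listed above:

```python
# Microsoft's candidate captions and confidences, from the list above.
captions = [
    ("a group of people posing for a photo", 75.8),
    ("a group of people posing for the camera", 75.7),
    ("a group of people posing for a picture", 75.6),
]

best_caption, best_score = max(captions, key=lambda pair: pair[1])
```

The three candidates are near-duplicates separated by a tenth of a point, so the "best" caption here is effectively a coin flip among paraphrases.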