Human Generated Data

Title

Untitled (two photographs: guests raising glasses to bride and groom at head of table; five men and women posing in kitchen during wedding reception)

Date

1954, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6189

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6
Person 99.6
Stage 99.4
Person 99.3
Person 99.2
Person 98.3
Person 98
Person 97.3
Clothing 96.4
Apparel 96.4
Person 94.8
Person 90.5
Leisure Activities 68.9
Person 68.2
Indoors 56.7
Room 56.7
Sleeve 56.1
Dance Pose 55.9

Clarifai
created on 2019-11-16

people 99.9
adult 98.4
group 98.2
woman 96.1
wear 95.4
furniture 95.1
man 95
room 94.2
many 92.1
actress 91.5
child 90.6
group together 90.2
indoors 87.6
family 86.3
outfit 85.2
theater 85.2
movie 85.1
music 85
administration 84.3
two 82.6

Imagga
created on 2019-11-16

man 20.2
people 20.1
person 18.8
newspaper 16.3
coat 15.7
old 15.3
lab coat 15
couple 14.8
male 14.2
product 13.5
religion 13.4
bride 13.4
dress 12.6
adult 12.2
art 11.9
portrait 11
creation 10.9
happy 10.6
business 10.3
wedding 10.1
computer 9.9
office 9.8
one 9.7
clothing 9.6
love 9.5
world 9.3
smile 9.3
garment 9.1
history 8.9
new 8.9
family 8.9
businessman 8.8
groom 8.8
looking 8.8
holy 8.7
holiday 8.6
black 8.4
church 8.3
vintage 8.3
indoor 8.2
home 8
working 7.9
work 7.8
happiness 7.8
monitor 7.8
room 7.8
two 7.6
bouquet 7.5
religious 7.5
back 7.3
smiling 7.2
statue 7.1
women 7.1
professional 7
case 7

Google
created on 2019-11-16

Photograph 95.9
Snapshot 83.3
Room 65.7
Photography 62.4
Black-and-white 56.4
Art 50.2
Family 50.2

Microsoft
created on 2019-11-16

clothing 96.6
person 92.4
woman 88.1
text 85.5
black and white 72.9
white 68.3
drawing 64.6
dress 62.3
man 53.3
old 47.1
clothes 46.7
posing 36.1

Face analysis

Amazon

AWS Rekognition

Age 26-40
Gender Female, 54.9%
Calm 53.5%
Surprised 45.1%
Happy 45.2%
Sad 45.4%
Confused 45.2%
Angry 45.2%
Fear 45.1%
Disgusted 45.2%

AWS Rekognition

Age 22-34
Gender Female, 54.5%
Surprised 45.3%
Happy 48%
Sad 46.1%
Fear 45.2%
Calm 47.4%
Confused 47%
Disgusted 45.5%
Angry 45.6%

AWS Rekognition

Age 42-60
Gender Female, 54.8%
Fear 45.1%
Calm 45.1%
Surprised 45.1%
Confused 45.1%
Disgusted 45.2%
Happy 54.1%
Sad 45.2%
Angry 45.2%

AWS Rekognition

Age 31-47
Gender Male, 54.3%
Confused 45.2%
Fear 45.2%
Sad 45.2%
Disgusted 46%
Happy 51.8%
Angry 45.5%
Calm 45.9%
Surprised 45.2%

AWS Rekognition

Age 19-31
Gender Male, 52.1%
Calm 45%
Happy 54.7%
Angry 45%
Disgusted 45%
Fear 45.1%
Sad 45.1%
Confused 45%
Surprised 45%

AWS Rekognition

Age 41-59
Gender Male, 54.9%
Fear 45.2%
Calm 48.4%
Angry 45.2%
Happy 50.4%
Confused 45.1%
Disgusted 45.1%
Sad 45.1%
Surprised 45.4%

AWS Rekognition

Age 19-31
Gender Female, 50.3%
Angry 49.5%
Surprised 49.5%
Sad 49.6%
Happy 49.7%
Calm 49.5%
Fear 50.1%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 23-35
Gender Male, 50%
Angry 49.5%
Surprised 49.5%
Disgusted 49.5%
Confused 49.5%
Fear 50.5%
Calm 49.5%
Happy 49.5%
Sad 49.5%

AWS Rekognition

Age 39-57
Gender Female, 50.5%
Angry 49.5%
Sad 50.4%
Confused 49.5%
Surprised 49.5%
Happy 49.5%
Disgusted 49.5%
Calm 49.5%
Fear 49.5%

AWS Rekognition

Age 20-32
Gender Male, 54.7%
Disgusted 45.1%
Angry 45.2%
Fear 45.6%
Happy 45.1%
Confused 45.1%
Sad 51.8%
Calm 46.9%
Surprised 45.2%

Feature analysis

Amazon

Person 99.6%
