Human Generated Data

Title

Untitled (women standing and sitting around large buffet table)

Date

1953

People

Artist: Martin Schweig, American, 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9506

Machine Generated Data

Tags

Amazon
created on 2022-01-28

Person 99.6
Human 99.6
Person 99.5
Person 99.5
Person 99.1
Person 97.1
Person 97.1
Person 96.9
Apparel 93.9
Clothing 93.9
Dress 90.1
People 89.4
Food 82.1
Dessert 82.1
Person 81.8
Cake 78.7
Female 75.1
Cream 72.5
Icing 72.5
Creme 72.5
Crowd 68.2
Shop 67.3
Woman 66.4
Photography 63.9
Photo 63.9
Portrait 62.3
Face 62.3
Torte 59.6
Overcoat 58.5
Coat 58.5
Suit 58.5
Person 54.9
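
The flat Name/Confidence list above matches the output shape of Amazon Rekognition's DetectLabels operation. A minimal sketch of such a call, assuming boto3 with configured AWS credentials (the file name is a placeholder; the exact parameters behind this record are not documented):

```python
import boto3

client = boto3.client("rekognition")

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # the list above bottoms out around 54.9
)

for label in response["Labels"]:
    # Repeated names (e.g. the several "Person" rows above) reflect
    # multiple detected instances of the same label.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```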

Imagga
created on 2022-01-28

couple 34.8
kin 30.2
home 28.7
people 28.4
man 28.4
person 23.4
adult 22.1
male 22
table 21.4
women 21.3
happy 21.3
dinner 19.4
bride 18.8
groom 18.7
cheerful 17.9
family 17.8
smiling 17.3
love 17.3
food 17.2
two 16.9
happiness 16.4
sitting 16.3
men 16.3
kitchen 16.1
drink 15.9
interior 15
wine 14.8
party 14.6
romantic 14.2
bouquet 14.1
indoors 14
room 13.9
celebration 13.5
lifestyle 13
wedding 12.9
day 12.5
together 12.3
flowers 12.2
restaurant 12.1
lunch 12.1
mature 12.1
meal 12
romance 11.6
husband 11.4
smile 11.4
marriage 11.4
wife 11.4
glass 11.3
senior 11.2
portrait 11
ceremony 10.7
old 10.4
friends 10.3
life 10.2
indoor 10
house 10
decoration 10
dress 9.9
modern 9.8
mother 9.7
enjoying 9.5
chair 9.5
alcohol 9.3
pretty 9.1
outdoors 9
businessman 8.8
business 8.5
clothing 8.5
relationship 8.4
friendship 8.4
eating 8.4
event 8.3
holding 8.2
cooking 7.9
daughter 7.7
30s 7.7
married 7.7
drinking 7.6
tabletop 7.6
casual 7.6
dining 7.6
leisure 7.5
executive 7.2
teacher 7.2
breakfast 7.1
musical instrument 7
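
Imagga's list has the same tag/confidence structure, as returned by its public v2 tagging endpoint. A hedged sketch using requests (key, secret, and image URL are placeholders; the response shape follows Imagga's documented JSON, result.tags[].tag.en plus confidence):

```python
import requests

IMAGGA_KEY = "your_api_key"        # placeholder
IMAGGA_SECRET = "your_api_secret"  # placeholder
IMAGE_URL = "https://example.org/schweig_buffet_1953.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),  # HTTP basic auth
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```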

Google
created on 2022-01-28

Microsoft
created on 2022-01-28

person 93.3
clothing 92
woman 89
wedding dress 85
dress 83.1
text 74.8
vase 68.5
bride 55.9
table 26.7
several 12.5
dining table 10.6
dining room 7.1
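
The Microsoft tags are consistent with the Azure Computer Vision Tag operation (v3.2 around the time these tags were created). A sketch against the REST endpoint, with a placeholder resource endpoint and subscription key; confidences come back in [0, 1] and are scaled to percentages here to match the list above:

```python
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/tag",
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```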

Face analysis

AWS Rekognition

Age 48-54
Gender Male, 99.9%
Calm 56.9%
Confused 29.3%
Sad 8.1%
Happy 2.2%
Surprised 1.6%
Angry 0.9%
Disgusted 0.8%
Fear 0.3%

AWS Rekognition

Age 47-53
Gender Male, 85.7%
Sad 81.1%
Confused 4.3%
Calm 3.9%
Happy 3.2%
Disgusted 2.5%
Fear 1.9%
Surprised 1.7%
Angry 1.4%

AWS Rekognition

Age 45-53
Gender Female, 96.1%
Calm 97.2%
Sad 1.9%
Confused 0.2%
Angry 0.2%
Happy 0.2%
Disgusted 0.2%
Surprised 0%
Fear 0%

AWS Rekognition

Age 50-58
Gender Female, 88%
Sad 58.6%
Calm 28.5%
Confused 4.5%
Happy 2.2%
Fear 2%
Surprised 1.5%
Disgusted 1.4%
Angry 1.4%

AWS Rekognition

Age 40-48
Gender Male, 99.7%
Sad 74.9%
Confused 13.8%
Calm 6.2%
Happy 2.6%
Angry 1.6%
Disgusted 0.5%
Surprised 0.3%
Fear 0.2%

AWS Rekognition

Age 22-30
Gender Female, 88.7%
Happy 83.7%
Calm 9.5%
Sad 2.9%
Fear 1.4%
Surprised 0.9%
Angry 0.7%
Confused 0.5%
Disgusted 0.3%
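
Each AWS Rekognition block above (age range, gender, ranked emotions) corresponds to one entry in DetectFaces output when all facial attributes are requested. A minimal boto3 sketch, again with a placeholder file name:

```python
import boto3

client = boto3.client("rekognition")

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, emotions, etc.;
# the default attribute set omits them.
response = client.detect_faces(Image={"Bytes": image_bytes}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```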

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very likely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
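
The Google Vision blocks report likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages; they map onto the FaceAnnotation fields of Cloud Vision face detection. A sketch using the official Python client, assuming GOOGLE_APPLICATION_CREDENTIALS is configured:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Each field is a Likelihood enum value, e.g. VERY_UNLIKELY.
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```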

Feature analysis

Amazon

Person 99.6%
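
The single "Person 99.6%" feature row likely summarizes per-instance detections, which DetectLabels exposes through each label's Instances array along with relative bounding boxes. A sketch of reading them, under the same assumptions as the DetectLabels sketch above:

```python
import boto3

client = boto3.client("rekognition")

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    response = client.detect_labels(Image={"Bytes": f.read()})

for label in response["Labels"]:
    for instance in label["Instances"]:
        box = instance["BoundingBox"]  # Left/Top/Width/Height, relative [0, 1]
        print(f'{label["Name"]} {instance["Confidence"]:.1f}% '
              f'at ({box["Left"]:.2f}, {box["Top"]:.2f})')
```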

Captions

Microsoft

a group of people standing in front of a window 83.4%
a group of people standing around a table 83.3%
a group of people standing in front of a cake 83.2%
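
The three ranked captions match the Azure Computer Vision Describe operation, which returns caption candidates with confidences. A compact sketch, reusing the placeholder endpoint and key from the Tag sketch above:

```python
import requests

ENDPOINT = "https://your-resource.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},  # ask for three ranked captions
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')
```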

Text analysis

Amazon

EXIT
22
O
32
18 O 32
18
KODVK-EVEETA
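
The Amazon text lines (the "EXIT" sign, stray digits, and the garbled Kodak film-edge marking) are typical DetectText output; the list above appears to mix LINE and WORD detections. A minimal boto3 sketch that prints only lines:

```python
import boto3

client = boto3.client("rekognition")

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # WORD entries repeat the same text
        print(detection["DetectedText"])
```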

Google

EXIT
EXIT
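
The Google pair ("EXIT" twice) is consistent with Cloud Vision text detection, where the first annotation is the full detected text and the remaining annotations are the individual tokens. A sketch with the same client setup as the face-detection example:

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("schweig_buffet_1953.jpg", "rb") as f:  # placeholder file name
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)

# text_annotations[0] is the full block; the rest are individual tokens.
for annotation in response.text_annotations:
    print(annotation.description)
```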