Human Generated Data

Title

Untitled (elevated view of woman passing out roses in hall at party)

Date

1959

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.9670

Machine Generated Data

Tags

Amazon
created on 2022-01-23

Person 99.3
Human 99.3
Indoors 98.9
Interior Design 98.9
Person 98.9
Person 98.7
Person 98.5
Clothing 96.8
Apparel 96.8
Person 96.6
Dress 93.5
Person 87.8
Female 85.4
Room 84.2
People 81.6
Face 73
Costume 70.8
Girl 68.1
Woman 66.7
Photography 65.1
Photo 65.1
Portrait 65.1
Floor 63.7
Furniture 63.4
Person 62.9
Child 62.3
Kid 62.3
Person 61.1
Flooring 57.2
Evening Dress 56.6
Fashion 56.6
Robe 56.6
Gown 56.6
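
These Amazon tags are label-detection output from AWS Rekognition. Below is a minimal sketch of the kind of call that produces such a list, using boto3; the file name and the MaxLabels/MinConfidence settings are illustrative assumptions, not values recorded here.

import boto3

# Credentials and region are taken from the environment.
client = boto3.client("rekognition")

# "photo.jpg" stands in for the digitized photograph.
with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,      # assumed cap on returned labels
        MinConfidence=55,  # assumed floor, consistent with the lowest score above
    )

# Print "Name Confidence" pairs in the same format as the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")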

Imagga
created on 2022-01-23

groom 33.3
architecture 32.1
building 26.5
old 20.9
monument 18.7
history 17.9
sculpture 16.6
ancient 15.6
statue 15.4
art 15.2
negative 15.1
historic 14.7
column 14.5
historical 14.1
world 14
people 13.9
marble 13.9
wall 13.7
culture 13.7
house 13.5
person 13.4
travel 13.4
tourism 13.2
kin 13
famous 13
film 12.8
religion 12.5
teacher 12.1
balcony 11.9
style 11.9
stone 11.8
dress 11.7
traditional 11.6
city 11.6
town 11.1
bride 10.7
facade 10.6
life 10.6
window 10.5
tourist 10.5
couple 10.5
church 10.2
adult 10.1
interior 9.7
arch 9.7
bouquet 9.6
man 9.4
wedding 9.2
photographic paper 9.2
educator 9.1
landmark 9
home 8.8
love 8.7
antique 8.7
luxury 8.6
male 8.5
design 8.4
tradition 8.3
room 8.1
decoration 8
women 7.9
portrait 7.8
professional 7.5
china 7.5
classic 7.4
exterior 7.4
new 7.3
color 7.2
romantic 7.1
musical instrument 7
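
The Imagga tags come from Imagga's hosted tagging API rather than an AWS SDK. A rough sketch of an equivalent request follows; the key, secret, and image URL are placeholders, and the v2 /tags endpoint and response shape are as given in Imagga's public documentation.

import requests

# Imagga authenticates with HTTP basic auth using an API key/secret pair.
resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.com/photo.jpg"},  # placeholder URL
    auth=("API_KEY", "API_SECRET"),  # placeholder credentials
)
resp.raise_for_status()

# Each entry carries a confidence score and a language-keyed tag name.
for tag in resp.json()["result"]["tags"]:
    print(f"{tag['tag']['en']} {tag['confidence']:.1f}")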

Google
created on 2022-01-23

Microsoft
created on 2022-01-23

person 96.5
dress 95.2
text 88.2
outdoor 87.8
clothing 84.2
woman 78.6
wedding dress 71.2
black and white 63.4

Face analysis

AWS Rekognition

Age 29-39
Gender Male, 97.7%
Calm 42.3%
Sad 38.6%
Happy 18.2%
Confused 0.3%
Disgusted 0.2%
Angry 0.2%
Surprised 0.1%
Fear 0.1%

AWS Rekognition

Age 26-36
Gender Male, 99.8%
Fear 67%
Happy 7.8%
Calm 7%
Confused 6.2%
Sad 4.2%
Surprised 4.1%
Angry 2.8%
Disgusted 0.9%
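
Each AWS Rekognition block above is a per-face result from the face-detection endpoint, which returns an estimated age range, a gender guess with confidence, and a ranked emotion distribution for every detected face. A minimal boto3 sketch of such a call (the file name is a placeholder):

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# One FaceDetails entry per detected face, echoing the blocks above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")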

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
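
The Google Vision blocks report qualitative likelihoods rather than percentages: the API returns a Likelihood enum per attribute for each detected face. A sketch using the google-cloud-vision client (the file name is a placeholder):

from google.cloud import vision

client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())
response = client.face_detection(image=image)

# Map the Likelihood enum values to the labels used above.
labels = ("Unknown", "Very unlikely", "Unlikely",
          "Possible", "Likely", "Very likely")
for face in response.face_annotations:
    print("Surprise", labels[face.surprise_likelihood])
    print("Anger", labels[face.anger_likelihood])
    print("Sorrow", labels[face.sorrow_likelihood])
    print("Joy", labels[face.joy_likelihood])
    print("Headwear", labels[face.headwear_likelihood])
    print("Blurred", labels[face.blurred_likelihood])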

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a group of people standing in front of a building 86.8%
a group of people sitting on a bench in front of a building 67.6%
a group of people standing around a bench 67.5%
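
These ranked captions look like output from the Azure Computer Vision "describe" operation, which returns several candidate captions with confidences. A sketch with the Python SDK follows; the endpoint, key, and image URL are placeholders, and the 0-1 confidences are scaled to percentages to match the figures above.

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<resource>.cognitiveservices.azure.com/",   # placeholder endpoint
    CognitiveServicesCredentials("<subscription-key>"),  # placeholder key
)

analysis = client.describe_image(
    "https://example.com/photo.jpg",  # placeholder URL
    max_candidates=3,
)
for caption in analysis.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")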

Text analysis

Amazon

2310C
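
The string above is Rekognition's OCR reading of text visible in the scan, likely a film-edge or negative-sleeve code. A minimal boto3 sketch of the text-detection call (the file name is a placeholder); the Google results below show a similar pattern, where the OCR engine appears to return both whole lines and their individual word fragments.

import boto3

client = boto3.client("rekognition")
with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

# LINE entries are whole detected lines; WORD entries repeat them word by word.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])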

Google

-AGON
MJIA- -YT A2- -AGON
-YT
A2-
MJIA-