Human Generated Data

Title

Untitled (bride and groom with "reserved" sign)

Date

1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19548

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Person 99.5
Person 99.2
Chair 97.4
Furniture 97.4
Meal 94.7
Food 94.7
Interior Design 93.7
Indoors 93.7
Clothing 90.2
Apparel 90.2
Face 89.6
Dish 86.9
Couch 84.8
Sitting 81.5
Female 81.3
Table 79.5
Room 79.3
Restaurant 71.4
Living Room 70.3
Woman 68.1
People 66.1
Helmet 65.8
Suit 65
Coat 65
Overcoat 65
Photography 63.2
Photo 63.2
Girl 62.9
Child 60.1
Kid 60.1
Head 59.5
Crowd 58.8
Housing 58.8
Building 58.8
Cafeteria 57.2

Imagga
created on 2022-03-05

shower cap 43
cap 35.5
headdress 30.9
man 27.5
person 23.2
people 22.8
portrait 22.6
male 22
clothing 20.8
adult 20.7
hair 18.2
senior 17.8
negative 16.8
face 16.3
human 15.7
film 14.7
covering 14.6
looking 14.4
black 14.3
worker 13.3
mask 13.2
happy 13.1
men 12.9
attractive 12.6
old 12.5
elderly 12.4
model 12.4
work 11.8
hand 11.4
lady 11.4
fashion 11.3
consumer goods 11.2
sexy 11.2
pretty 11.2
mature 11.1
health 11.1
professional 11
photographic paper 10.6
eyes 10.3
smile 10
medical 9.7
one 9.7
equipment 9.5
lifestyle 9.4
working 8.8
device 8.6
casual 8.5
doctor 8.5
leisure 8.3
occupation 8.2
danger 8.2
love 7.9
expression 7.7
age 7.6
power 7.5
smoke 7.4
glasses 7.4
makeup 7.3
alone 7.3
industrial 7.3
photographic equipment 7.2
life 7.2
body 7.2
women 7.1
night 7.1
medicine 7
indoors 7

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.1
black and white 90.2
person 85.5
clothing 80.4
man 75.1

Face analysis

AWS Rekognition

Age 30-40
Gender Male, 54.3%
Happy 61.8%
Calm 29.9%
Surprised 3.8%
Confused 1.5%
Angry 1.1%
Disgusted 1%
Sad 0.5%
Fear 0.5%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Helmet 65.8%
Suit 65%

Captions

Microsoft

text 51.3%

Text analysis

Amazon

35
RESERVED.
9
9 LIEW
LIEW

Google

5
RESERVED. 6 3 5 35
35
6
RESERVED.
3