Human Generated Data

Title

Untitled (bride and groom with "reserved" sign at table)

Date

1950

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19551

Machine Generated Data

Tags (label and model confidence, 0-100)

Amazon
created on 2022-03-05

Human 99.7
Person 99.7
Person 98.7
Tabletop 97.6
Furniture 97.6
Face 96.7
Clothing 96.5
Apparel 96.5
Home Decor 95.1
Person 94.1
Table 89.3
Indoors 89
Meal 85.4
Food 85.4
Dining Table 83.9
Room 83.3
Plant 82.3
Nature 80.3
Dish 79.8
Female 79.7
Person 79.6
Linen 78.7
Outdoors 77.7
Chair 77
Sunglasses 75.5
Accessories 75.5
Accessory 75.5
Vase 74.1
Pottery 74.1
Jar 74.1
Potted Plant 73.5
Suit 72.3
Coat 72.3
Overcoat 72.3
Photography 71.7
Portrait 71.7
Photo 71.7
Building 66.2
Housing 66.2
Woman 64.7
Skin 63.8
Blossom 62.5
Flower 62.5
Art 61
Girl 60.2
Head 59.8
Dining Room 59.5
Man 59
Glass 58.3
Bridegroom 57.6
Wedding 57.6
Desk 56.3
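
These labels match the output of Amazon Rekognition's DetectLabels API, which scores each label with a confidence on a 0-100 scale and, for some labels, attaches bounding-box instances (the likely source of the Person and Sunglasses entries under Feature analysis below). A minimal sketch using boto3; the region and file name are placeholders, not part of this record:

import boto3

# Request labels for a local image; region and file name are placeholders.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=100,
        MinConfidence=55.0,  # the lowest score in the list above is 56.3
    )

for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
    # Labels with localized instances also carry bounding boxes,
    # which is where detections like "Person 99.7%" come from.
    for instance in label.get("Instances", []):
        print("  box:", instance["BoundingBox"])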

Imagga
created on 2022-03-05

man 46.3
person 35.5
male 34.9
nurse 31.3
people 25.6
patient 23
men 21.5
working 21.2
coat 19.7
professional 19.7
lab coat 19.7
medical 19.4
worker 19.4
work 19
adult 18.7
businessman 16.8
job 15.9
business 15.8
team 15.2
smiling 15.2
hospital 15.2
doctor 15
office 14.6
indoors 14
health 13.9
room 13.8
occupation 13.7
sitting 13.7
medicine 13.2
clinic 13
home 12.8
life 12.1
happy 11.3
case 11.3
hand 10.6
uniform 10.6
cheerful 10.6
teacher 10.4
student 10.4
holding 9.9
laboratory 9.6
30s 9.6
talking 9.5
women 9.5
sick person 9.4
clothing 9.1
kitchen 8.9
to 8.8
interior 8.8
lab 8.7
boy 8.7
mid adult 8.7
happiness 8.6
profession 8.6
thinking 8.5
industry 8.5
casual 8.5
senior 8.4
portrait 8.4
communication 8.4
teamwork 8.3
laptop 8.3
technology 8.2
computer 8
looking 8
smile 7.8
scientist 7.8
equipment 7.8
employee 7.8
surgery 7.8
two people 7.8
colleagues 7.8
meeting 7.5
instrument 7.5
human 7.5
care 7.4
garment 7.3
confident 7.3
businesswoman 7.3
black 7.2
table 7
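
Imagga's tags come from its hosted tagging REST API; as an assumption, the sketch below uses the v2 /tags endpoint with HTTP Basic auth, which returns tags scored 0-100 like the list above. The key, secret, and image URL are placeholders:

import requests

# Placeholders: supply your own Imagga key/secret and an image URL.
API_KEY = "YOUR_API_KEY"
API_SECRET = "YOUR_API_SECRET"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": "https://example.org/photo.jpg"},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')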

Google
created on 2022-03-05

Microsoft
created on 2022-03-05

text 98.1
window 90.2
black and white 77.4
clothing 70.7
person 69.9
posing 41.4
clothes 28.8
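
The Microsoft tags (and the captions further down) are the kind of output produced by the Azure Computer Vision API. A minimal sketch with the azure-cognitiveservices-vision-computervision SDK; endpoint, key, and image URL are placeholders:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

# Placeholders: supply your own Azure endpoint, key, and image URL.
client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

result = client.tag_image("https://example.org/photo.jpg")
for tag in result.tags:
    # The SDK reports confidences on a 0-1 scale; the list above uses 0-100.
    print(f"{tag.name} {tag.confidence * 100:.1f}")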

Face analysis

AWS Rekognition

Age 49-57
Gender Male, 91.9%
Calm 95.7%
Sad 1.6%
Happy 0.7%
Surprised 0.7%
Confused 0.4%
Disgusted 0.4%
Angry 0.3%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 73.2%
Surprised 66.7%
Calm 28.2%
Happy 3.8%
Confused 0.8%
Disgusted 0.3%
Angry 0.1%
Sad 0.1%
Fear 0.1%

AWS Rekognition

Age 38-46
Gender Male, 92.2%
Surprised 93.2%
Calm 4%
Happy 1.9%
Angry 0.2%
Fear 0.2%
Disgusted 0.2%
Sad 0.1%
Confused 0.1%
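
The three AWS Rekognition entries above are per-face results of the kind DetectFaces returns when all attributes are requested: an estimated age range, a gender guess with confidence, and a ranked list of emotions. A minimal boto3 sketch; the file name is a placeholder:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # needed for age, gender, and emotions
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in sorted(face["Emotions"],
                          key=lambda e: e["Confidence"], reverse=True):
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')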

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
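
The four Google Vision blocks are face-detection results; Google reports each attribute as a likelihood bucket (VERY_UNLIKELY through VERY_LIKELY) rather than a percentage. A sketch with the google-cloud-vision client, assuming a local file; the file name is a placeholder:

from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    # Each attribute is a Likelihood enum, not a numeric confidence.
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)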

Feature analysis

Amazon

Person 99.7%
Sunglasses 75.5%

Captions

Microsoft

a group of stuffed animals sitting next to a window 64.8%
a group of people posing for a photo 64.7%
a group of people posing for the camera 64.6%
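
The three ranked captions match Azure Computer Vision's describe operation, which returns up to max_candidates candidate sentences, each with a confidence. A short sketch; endpoint, key, and image URL are placeholders as in the tags example above:

from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",
    CognitiveServicesCredentials("YOUR_KEY"),
)

description = client.describe_image(
    "https://example.org/photo.jpg",
    max_candidates=3,  # the record above shows three ranked captions
)
for caption in description.captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")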

Text analysis

Amazon

RESERVED.
S
2

Google

a
2
47
RESERVED,
YTERA2
a YTERA2 2 47 RESERVED,
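
These OCR fragments are typical text-detection output: Rekognition's DetectText returns individual words and lines with confidences, while Google Vision's text_detection returns per-token results plus one annotation that concatenates the full text, which is why the last Google line repeats the others. A minimal Rekognition sketch; the file name is a placeholder:

import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    response = client.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    # Type is WORD or LINE; short, noisy fragments like "YTERA2"
    # are common for signs and decorations shot at an angle.
    print(detection["Type"], detection["DetectedText"],
          f'{detection["Confidence"]:.1f}%')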