Human Generated Data

Title

Untitled (bride and groom with 'reserved' sign)

Date

1957

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19610

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Person 99.5
Human 99.5
Person 98.4
Apparel 94.5
Clothing 94.5
Person 91.7
Food 87.9
Meal 87.9
Face 86.4
Furniture 83.5
Chair 83.5
Dish 82.6
Person 80.6
Curtain 69.1
Table 68
Potted Plant 66
Pottery 66
Plant 66
Jar 66
Vase 66
Photo 62.8
Photography 62.8
People 62.8
Flower 62.3
Blossom 62.3
Restaurant 62.1
Suit 59.9
Coat 59.9
Overcoat 59.9
Text 59.5
Shirt 55.2
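
The Amazon tags above are label/confidence pairs from an object-detection service. A minimal sketch of how such pairs could be produced with AWS Rekognition's detect_labels call follows; the filename, region, and thresholds are illustrative assumptions, not details taken from this record.

```python
# A minimal sketch, assuming boto3 with valid AWS credentials and a local copy
# of the photograph; the filename and region are placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("bride_and_groom.jpg", "rb") as f:  # hypothetical filename
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,
    )

# Each label pairs a name with a confidence score (0-100), matching the
# "Person 99.5", "Apparel 94.5", ... entries above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```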

Imagga
created on 2022-03-05

man 38.9
person 36.2
male 31.2
people 26.8
adult 24.9
white 20
couple 19.2
happy 18.8
portrait 16.8
smiling 16.6
holding 15.7
sitting 15.5
clothing 15.4
face 14.9
business 14.6
smile 14.2
kin 14
lifestyle 13.7
home 13.6
love 13.4
work 12.5
indoors 12.3
computer 12
attractive 11.9
happiness 11.7
seller 11.6
working 11.5
black 11.4
together 11.4
men 11.2
dress 9.9
cheerful 9.7
businessman 9.7
restaurant 9.7
looking 9.6
husband 9.5
coat 9.5
adults 9.5
room 9.2
laptop 9.1
professional 9.1
handsome 8.9
office 8.8
spectator 8.8
mid adult 8.7
30s 8.7
bride 8.6
groom 8.6
casual 8.5
communication 8.4
20s 8.2
indoor 8.2
life 8.1
bartender 8
job 8
women 7.9
color 7.8
garment 7.7
married 7.7
fan 7.6
two 7.6
hand 7.6
doctor 7.5
robe 7.5
mask 7.4
wedding 7.4
confident 7.3
businesswoman 7.3
worker 7.2
world 7.2
celebration 7.2
family 7.1
day 7.1
medical 7.1
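
The Imagga tags follow the same tag/confidence pattern. A hedged sketch against Imagga's v2 tagging REST endpoint is below; the key, secret, and image URL are placeholders, and the response parsing should be checked against Imagga's current documentation.

```python
# A minimal sketch, assuming Imagga's v2 /tags endpoint and a publicly
# reachable image URL; credentials and URL are placeholders.
import requests

API_KEY = "YOUR_API_KEY"        # placeholder
API_SECRET = "YOUR_API_SECRET"  # placeholder
IMAGE_URL = "https://example.org/bride_and_groom.jpg"  # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
)
resp.raise_for_status()

# Tags arrive with a confidence score, comparable to the
# "man 38.9", "person 36.2", ... pairs above.
for item in resp.json()["result"]["tags"]:
    print(item["tag"]["en"], round(item["confidence"], 1))
```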

Google
created on 2022-03-05 (no tags recorded)

Microsoft
created on 2022-03-05

person 98.2
black and white 94
clothing 93.2
human face 91.8
text 90.7
smile 86.5
man 84.7
monochrome 51.1
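
The Microsoft tags likewise come with confidence scores. A sketch using the Azure Computer Vision SDK's tag_image call follows; the endpoint, key, and image URL are placeholders.

```python
# A minimal sketch, assuming the azure-cognitiveservices-vision-computervision
# SDK and an Azure Computer Vision resource; endpoint, key, and image URL are
# placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
KEY = "YOUR_KEY"                                                   # placeholder
IMAGE_URL = "https://example.org/bride_and_groom.jpg"              # placeholder

client = ComputerVisionClient(ENDPOINT, CognitiveServicesCredentials(KEY))

# Confidences are returned in 0-1; scaling by 100 gives values comparable to
# "person 98.2", "black and white 94", ... above.
for tag in client.tag_image(IMAGE_URL).tags:
    print(tag.name, round(tag.confidence * 100, 1))
```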

Face analysis

AWS Rekognition

Age 36-44
Gender Male, 99.6%
Calm 95.2%
Happy 2.6%
Sad 0.7%
Surprised 0.5%
Disgusted 0.3%
Confused 0.3%
Angry 0.2%
Fear 0.1%

AWS Rekognition

Age 33-41
Gender Female, 88.6%
Happy 49%
Surprised 48.4%
Angry 0.8%
Disgusted 0.5%
Confused 0.4%
Calm 0.4%
Fear 0.2%
Sad 0.2%
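
The two AWS Rekognition blocks above are per-face results: an estimated age range, a gender guess, and an emotion distribution. A minimal sketch with detect_faces follows; as before, the filename is a placeholder.

```python
# A minimal sketch, assuming boto3 with valid AWS credentials; the filename is
# a placeholder. Attributes=["ALL"] requests age range, gender, and emotions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("bride_and_groom.jpg", "rb") as f:  # hypothetical filename
    faces = rekognition.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in faces["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotion scores are 0-100, highest first, like the lists above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```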

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely
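
The Google Vision entries report per-face likelihood ratings rather than numeric scores. A sketch using the google-cloud-vision client follows; application default credentials and the filename are assumptions.

```python
# A minimal sketch, assuming the google-cloud-vision library and application
# default credentials; the filename is a placeholder. Each face annotation
# carries likelihood ratings from VERY_UNLIKELY to VERY_LIKELY.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("bride_and_groom.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    print("Surprise", face.surprise_likelihood.name)
    print("Anger", face.anger_likelihood.name)
    print("Sorrow", face.sorrow_likelihood.name)
    print("Joy", face.joy_likelihood.name)
    print("Headwear", face.headwear_likelihood.name)
    print("Blurred", face.blurred_likelihood.name)
```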

Feature analysis

Amazon

Person 99.5%

Captions

Microsoft

a man standing next to a window 72.9%
a man standing in front of a window 72.8%
a group of people standing next to a window 69%
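
The Microsoft captions are ranked natural-language descriptions with confidences. A sketch with the Azure Computer Vision describe_image call follows; endpoint, key, and image URL are the same kind of placeholders used in the tagging sketch.

```python
# A minimal sketch, assuming the azure-cognitiveservices-vision-computervision
# SDK; endpoint, key, and image URL are placeholders.
from azure.cognitiveservices.vision.computervision import ComputerVisionClient
from msrest.authentication import CognitiveServicesCredentials

client = ComputerVisionClient(
    "https://<your-resource>.cognitiveservices.azure.com/",  # placeholder
    CognitiveServicesCredentials("YOUR_KEY"),                 # placeholder
)
IMAGE_URL = "https://example.org/bride_and_groom.jpg"         # placeholder

# max_candidates=3 yields ranked captions with confidences, like the three
# candidates listed above.
for caption in client.describe_image(IMAGE_URL, max_candidates=3).captions:
    print(f"{caption.text} {caption.confidence * 100:.1f}%")
```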

Text analysis

Amazon

RESERVED.
LE
SUPER
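
The Amazon text results come from OCR over the photograph (the "RESERVED." table sign plus other lettering in the frame). A minimal detect_text sketch is below; the filename is a placeholder.

```python
# A minimal sketch, assuming boto3 with valid AWS credentials; the filename is
# a placeholder. LINE detections give strings like "RESERVED." above.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("bride_and_groom.jpg", "rb") as f:  # hypothetical filename
    detections = rekognition.detect_text(Image={"Bytes": f.read()})

for text in detections["TextDetections"]:
    if text["Type"] == "LINE":
        print(text["DetectedText"])
```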

Google

RESERVED
RESERVED
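
The Google results likewise come from OCR. A sketch using the google-cloud-vision text_detection call follows, under the same placeholder assumptions as the face-detection sketch.

```python
# A minimal sketch, assuming the google-cloud-vision library and application
# default credentials; the filename is a placeholder. The first annotation is
# the full detected text; the rest are individual strings like "RESERVED".
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("bride_and_groom.jpg", "rb") as f:  # hypothetical filename
    image = vision.Image(content=f.read())

response = client.text_detection(image=image)
for annotation in response.text_annotations:
    print(annotation.description)
```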