Human Generated Data

Title

Untitled (bride sitting for studio portrait)

Date

1947

People

Artist: Samuel Cooper, American active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19578

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Tripod 99.7
Person 98.8
Human 98.8
Person 97.6
Clothing 86.5
Apparel 86.5
Sitting 82.4
Person 81.4
Photography 77.6
Photo 77.6
Floor 65.4
Fashion 64.5
Gown 62.7
Wedding 58.6
Studio 57.9
Robe 57
Wedding Gown 56.1
Photographer 56.1
Flooring 55

Imagga
created on 2022-03-05

musical instrument 24.5
male 24.1
man 22.2
people 21.2
person 20.6
wind instrument 18.1
room 17.2
adult 17.1
interior 15
indoors 14.9
chair 14.7
modern 14.7
microphone 14.2
professional 14.1
device 14
black 13.8
style 13.3
brass 13.1
men 12.9
fashion 12.8
music 12.7
business 12.1
musician 12.1
singer 12.1
trombone 11.7
smiling 11.6
group 11.3
concert 10.7
happy 10.6
stage 10.6
performer 10.5
attractive 10.5
office 10.4
table 10.4
lifestyle 10.1
window 10.1
elegance 10.1
studio 9.9
bass 9.8
guitar 9.8
bowed stringed instrument 9.8
businessman 9.7
portrait 9.7
sexy 9.6
home 9.6
standing 9.6
women 9.5
work 9.4
clothing 9.3
house 9.2
stringed instrument 9.2
silhouette 9.1
furniture 9.1
job 8.8
light 8.7
rock 8.7
performance 8.6
instrument 8.6
smile 8.5
photographer 8.4
floor 8.4
dress 8.1
equipment 8
posing 8
cleaner 8
happiness 7.8
color 7.8
model 7.8
life 7.7
harmonica 7.7
musical 7.7
communication 7.6
star 7.5
holding 7.4
occupation 7.3
medical 7.1

Microsoft
created on 2022-03-05

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 54.1%
Sad 26.6%
Surprised 26%
Disgusted 15.4%
Happy 14.6%
Angry 7.5%
Calm 6.5%
Fear 2.4%
Confused 1%

AWS Rekognition

Age 41-49
Gender Female, 82.1%
Calm 94.9%
Happy 1%
Sad 1%
Disgusted 0.9%
Angry 0.6%
Surprised 0.6%
Fear 0.5%
Confused 0.4%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a group of people standing in a room 86.7%
a group of people in a room 86.6%
a group of people that are standing in a room 80.9%

Text analysis

Amazon

2
ЬВЕГА