Human Generated Data

Title

Untitled (bride sitting for studio portrait)

Date

1947

People

Artist: Samuel Cooper, American, active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19578

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Tripod 99.7
Person 98.8
Human 98.8
Person 97.6
Clothing 86.5
Apparel 86.5
Sitting 82.4
Person 81.4
Photography 77.6
Photo 77.6
Floor 65.4
Fashion 64.5
Gown 62.7
Wedding 58.6
Studio 57.9
Robe 57
Wedding Gown 56.1
Photographer 56.1
Flooring 55
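
The label/confidence pairs above appear to come from an automatic image-labeling service (Amazon Rekognition is referenced elsewhere in this record). Below is a minimal Python sketch, using the boto3 SDK, of how comparable pairs could be retrieved; the S3 bucket and object key are placeholders, not the museum's actual storage.

import boto3

# Hypothetical S3 location for the digitized photograph; replace with real values.
BUCKET = "example-bucket"
KEY = "images/4.2002.19578.jpg"

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": BUCKET, "Name": KEY}},
    MaxLabels=20,        # the record above lists roughly 20 tags
    MinConfidence=55.0,  # lowest confidence shown above is about 55
)

for label in response["Labels"]:
    # Prints pairs such as "Tripod 99.7", matching the format of the tag list.
    print(f'{label["Name"]} {label["Confidence"]:.1f}')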

Clarifai
created on 2023-10-22

people 99.6
monochrome 98.6
wedding 98
two 96.9
bride 96.3
veil 94.8
curtain 94.1
woman 93.1
art 93
adult 92.9
couple 92.2
family 91.6
dress 91.5
man 90.9
one 90.6
wear 90.3
furniture 90.3
indoors 90.2
chair 89.8
room 89.2
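
The concepts above (people, monochrome, wedding, ...) are the kind of output Clarifai's general image-recognition model returns. The sketch below shows one plausible request against Clarifai's v2 REST predict endpoint; the model name, API key, and image URL are placeholders and should be checked against Clarifai's current documentation.

import requests

# Placeholder credentials and image location; not the museum's actual values.
API_KEY = "YOUR_CLARIFAI_API_KEY"
IMAGE_URL = "https://example.org/4.2002.19578.jpg"

resp = requests.post(
    "https://api.clarifai.com/v2/models/general-image-recognition/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
    timeout=30,
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Prints pairs such as "people 99.6", matching the Clarifai list above.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')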

Imagga
created on 2022-03-05

musical instrument 24.5
male 24.1
man 22.2
people 21.2
person 20.6
wind instrument 18.1
room 17.2
adult 17.1
interior 15
indoors 14.9
chair 14.7
modern 14.7
microphone 14.2
professional 14.1
device 14
black 13.8
style 13.3
brass 13.1
men 12.9
fashion 12.8
music 12.7
business 12.1
musician 12.1
singer 12.1
trombone 11.7
smiling 11.6
group 11.3
concert 10.7
happy 10.6
stage 10.6
performer 10.5
attractive 10.5
office 10.4
table 10.4
lifestyle 10.1
window 10.1
elegance 10.1
studio 9.9
bass 9.8
guitar 9.8
bowed stringed instrument 9.8
businessman 9.7
portrait 9.7
sexy 9.6
home 9.6
standing 9.6
women 9.5
work 9.4
clothing 9.3
house 9.2
stringed instrument 9.2
silhouette 9.1
furniture 9.1
job 8.8
light 8.7
rock 8.7
performance 8.6
instrument 8.6
smile 8.5
photographer 8.4
floor 8.4
dress 8.1
equipment 8
posing 8
cleaner 8
happiness 7.8
color 7.8
model 7.8
life 7.7
harmonica 7.7
musical 7.7
communication 7.6
star 7.5
holding 7.4
occupation 7.3
medical 7.1
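
The Imagga tags (including the largely spurious musical-instrument labels) are the kind of output Imagga's auto-tagging endpoint produces. A minimal sketch against its v2 tags endpoint with HTTP basic auth follows; the key, secret, and image URL are placeholders.

import requests

# Placeholder Imagga credentials and image URL.
API_KEY = "YOUR_IMAGGA_API_KEY"
API_SECRET = "YOUR_IMAGGA_API_SECRET"
IMAGE_URL = "https://example.org/4.2002.19578.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(API_KEY, API_SECRET),
    timeout=30,
)
resp.raise_for_status()

for tag in resp.json()["result"]["tags"]:
    # Prints pairs such as "male 24.1", matching the Imagga list above.
    print(f'{tag["tag"]["en"]} {tag["confidence"]:.1f}')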

Microsoft
created on 2022-03-05

Color Analysis

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 54.1%
Sad 26.6%
Surprised 26%
Disgusted 15.4%
Happy 14.6%
Angry 7.5%
Calm 6.5%
Fear 2.4%
Confused 1%

AWS Rekognition

Age 41-49
Gender Female, 82.1%
Calm 94.9%
Happy 1%
Sad 1%
Disgusted 0.9%
Angry 0.6%
Surprised 0.6%
Fear 0.5%
Confused 0.4%
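
The two AWS Rekognition blocks above are per-face estimates (age range, gender, emotions), one for each figure detected in the photograph. Below is a minimal boto3 sketch of how such attributes could be requested; the S3 location is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder S3 location for the scanned photograph.
response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "images/4.2002.19578.jpg"}},
    Attributes=["ALL"],  # request age range, gender, and emotion estimates
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {face["Gender"]["Value"]}, {face["Gender"]["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # e.g. "Calm 94.9%", as in the second block above
        print(f'{emotion["Type"].title()} {emotion["Confidence"]:.1f}%')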

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely
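
Google Vision reports likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the values above read "Very unlikely". A minimal sketch with the google-cloud-vision client is below; the image URI is a placeholder.

from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder image URI; not the museum's actual storage location.
image = vision.Image()
image.source.image_uri = "https://example.org/4.2002.19578.jpg"

response = client.face_detection(image=image)

for face in response.face_annotations:
    # Likelihood enums correspond to the buckets above ("Very unlikely", etc.).
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)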

Feature analysis

Amazon

Person
Person 98.8%
Person 97.6%
Person 81.4%

Categories

Imagga

interior objects 96.8%
paintings art 2.9%

Text analysis

Amazon

2
ЬВЕГА
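
The detected strings ("2" and the Cyrillic-looking "ЬВЕГА") are most likely stray markings on or around the print picked up by automatic text detection. A minimal boto3 sketch of Rekognition text detection follows; the S3 location is a placeholder.

import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

# Placeholder S3 location for the scanned photograph.
response = rekognition.detect_text(
    Image={"S3Object": {"Bucket": "example-bucket", "Name": "images/4.2002.19578.jpg"}}
)

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        # Prints detected lines such as "2", as listed above.
        print(detection["DetectedText"], f'{detection["Confidence"]:.1f}%')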