Human Generated Data

Title

Untitled (bride sitting for studio portrait)

Date

1947

People

Artist: Samuel Cooper, American active 1950s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.19579

Machine Generated Data

Tags

Amazon
created on 2022-03-05

Human 97.5
Person 97.1
Clothing 94.9
Apparel 94.9
Cow 87.5
Animal 87.5
Mammal 87.5
Cattle 87.5
Sitting 73.8
Musician 68.7
Musical Instrument 68.7
Photography 67
Photo 67
Tripod 65.9
Flooring 64.6
Floor 64.2
Horse 63.6
Chair 63
Furniture 63
Dance 58.2
Wedding 57.1
Female 57

Clarifai
created on 2023-10-22

wedding 99.3
people 99.2
bride 99
monochrome 99
veil 98.7
dress 96.6
wear 95.9
two 95.7
groom 94
woman 92.6
adult 92.5
outfit 92.2
man 91.3
actress 90.8
couple 90.4
bridal 89.8
fashion 89.2
art 88.8
music 88
marriage 87.6

Imagga
created on 2022-03-05

crutch 72.6
staff 56.2
stick 43.5
cleaner 29.5
man 23.5
person 22.2
adult 22
male 20.6
people 20.1
fashion 13.6
outdoors 13.4
black 13.3
swab 13.1
equipment 12.8
one 12.7
dress 11.7
men 11.2
cleaning implement 11.1
portrait 11
sport 10.7
hand 10.6
walking 10.4
work 10.2
tripod 9.7
energy 9.2
attractive 9.1
human 9
posing 8.9
working 8.8
body 8.8
home 8.8
couple 8.7
standing 8.7
professional 8.5
old 8.4
holding 8.2
alone 8.2
industrial 8.2
sexy 8
looking 8
costume 7.8
hat 7.7
pretty 7.7
industry 7.7
studio 7.6
clothing 7.6
house 7.5
clothes 7.5
style 7.4
business 7.3
suit 7.2
support 7.2
sword 7.1
women 7.1
worker 7.1
interior 7.1
job 7.1
travel 7
indoors 7

Microsoft
created on 2022-03-05

Color Analysis

Face analysis

AWS Rekognition

Age 27-37
Gender Female, 96.4%
Surprised 93.9%
Calm 2%
Happy 1.5%
Fear 0.8%
Disgusted 0.8%
Angry 0.5%
Sad 0.4%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 97.1%
Cow 87.5%
Horse 63.6%

Text analysis

Amazon

a
SUBERGAN