Human Generated Data

Title

Untitled (mother in dark dress posed with baby on chair with embroidered decorations)

Date

c. 1930-1945

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10915

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Person 98.7
Human 98.7
Person 96.2
Furniture 94.6
Face 86.7
People 86.5
Apparel 81
Clothing 81
Baby 74.3
Female 74.2
Indoors 72.4
Portrait 72.3
Photography 72.3
Photo 72.3
Room 71.6
Newborn 60
Art 59.8
Girl 58.9
Woman 57.9
Skin 57.6
Dress 55.8

Imagga
created on 2022-02-05

nurse 73.6
barbershop 52.9
shop 43.3
mercantile establishment 32.1
man 30.9
people 29.6
patient 28.9
medical 27.4
person 26.1
hospital 25.5
male 23.5
doctor 21.6
place of business 21.4
health 19.4
medicine 17.6
work 16.5
professional 16.4
men 16.3
illness 16.2
adult 15.1
room 14.8
happy 14.4
family 14.2
negative 13.9
home 13.6
clinic 13.3
indoors 13.2
senior 13.1
surgery 12.7
women 12.6
old 12.5
profession 12.4
care 12.3
smiling 11.6
couple 11.3
portrait 11
occupation 11
film 11
70s 10.8
worker 10.8
mother 10.7
establishment 10.7
sick 10.6
looking 10.4
casual 10.2
coat 10.1
equipment 9.9
mask 9.8
science 9.8
standing 9.6
ancient 9.5
bed 9.5
happiness 9.4
surgeon 8.9
operation 8.9
uniform 8.6
elderly 8.6
photographic paper 8.5
house 8.4
camera 8.3
treatment 8.3
human 8.2
religion 8.1
to 8
interior 8
black 7.8
face 7.8
two people 7.8
laboratory 7.7
emergency 7.7
exam 7.7
instrument 7.6
historic 7.3
team 7.2
love 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

drawing 94.6
clothing 93.7
text 91.1
person 90.6
sketch 87.2
posing 74.1
human face 72.2
black and white 64
baby 56.7
toddler 55.6

Face analysis

Amazon

Google

AWS Rekognition

Age 37-45
Gender Female, 99.9%
Surprised 66.7%
Sad 11.5%
Calm 9.1%
Fear 5.8%
Angry 2.7%
Disgusted 1.6%
Happy 1.4%
Confused 1.1%

AWS Rekognition

Age 0-6
Gender Male, 53.1%
Calm 96%
Surprised 2.4%
Disgusted 0.4%
Sad 0.3%
Angry 0.2%
Fear 0.2%
Confused 0.2%
Happy 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%

Captions

Microsoft

a man standing in front of a refrigerator 53%
a man standing in front of a window posing for the camera 52.9%
a man standing in front of a window 52.8%

Text analysis

Amazon

ه.م

Google

NAGOX-YT3RA2-MAMTZA3