Human Generated Data

Title

Untitled (mother in dark dress posed sitting with baby on chair with embroidered decorations)

Date

c. 1930-1945

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10913

Machine Generated Data

Tags

Amazon
created on 2022-02-05

Furniture 99.9
Person 98.3
Human 98.3
Person 95.8
Baby 85
Chair 83.8
Crib 83.6
Indoors 76.2
Clothing 74.6
Apparel 74.6
Face 74.1
Cradle 69.9
Portrait 66.9
Photography 66.9
Photo 66.9
Room 65.4
People 61.3
Newborn 57

Clarifai
created on 2023-10-29

people 99.9
two 97.9
group 97.9
child 97.7
adult 97
family 95
baby 94.1
monochrome 93.9
man 93.3
wear 93.2
three 93.1
woman 92.5
offspring 92.4
portrait 92.3
princess 91.8
wedding 90.6
son 88.8
veil 88.6
administration 87
nostalgia 86.7

Imagga
created on 2022-02-05

nurse 100
patient 35.9
medical 34.4
man 34.3
people 34
person 31.9
hospital 29
male 27.7
doctor 27.2
professional 23.8
health 20.8
adult 18.6
medicine 18.5
illness 18.1
men 18
happy 15.7
occupation 14.7
worker 14.4
work 14.1
room 13.7
laboratory 13.5
family 13.3
barbershop 13.3
senior 13.1
clinic 13
smiling 13
looking 12.8
home 12.8
lab 12.6
profession 12.4
care 12.3
indoors 12.3
shop 12.2
portrait 11.6
sick 11.6
negative 11.3
coat 11.3
mature 11.1
women 11.1
happiness 11
equipment 10.8
surgery 10.7
case 10.7
test 10.6
couple 10.4
biology 10.4
surgeon 10.4
love 10.3
film 9.9
scientist 9.8
old 9.8
human 9.7
job 9.7
exam 9.6
instrument 9.4
casual 9.3
smile 9.3
treatment 9.2
team 9
sick person 8.9
science 8.9
70s 8.8
uniform 8.8
40s 8.8
chemistry 8.7
elderly 8.6
research 8.6
bed 8.5
inside 8.3
mother 8.2
mercantile establishment 8.2
cheerful 8.1
to 8
child 7.9
business 7.9
operation 7.9
face 7.8
physician 7.8
scientific 7.7
chemical 7.7
development 7.6
lab coat 7.6
vintage 7.4
technology 7.4
office 7.2
lifestyle 7.2
kid 7.1
working 7.1

Google
created on 2022-02-05

Microsoft
created on 2022-02-05

text 94.2
clothing 90.4
person 88.6
baby 80.4
toddler 74.8
human face 71.4
drawing 53.1
posing 37.4

Color Analysis

Face analysis

AWS Rekognition

Age 37-45
Gender Female, 99.9%
Surprised 56.5%
Calm 19.7%
Sad 8.5%
Fear 5%
Happy 3.9%
Angry 3.3%
Disgusted 1.7%
Confused 1.4%

AWS Rekognition

Age 0-3
Gender Male, 95.4%
Calm 57.8%
Happy 19.9%
Surprised 9.3%
Disgusted 6%
Fear 3.1%
Angry 1.9%
Sad 1.3%
Confused 0.7%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person
Person 98.3%
Person 95.8%

Categories

Imagga

paintings art 99.8%

Text analysis

Amazon

-

Google

HAGON-YTITA-HAMT2A
HAGON-YTITA-HAMT2A