Human Generated Data

Title

Untitled (three women and baby)

Date

c. 1950

People

Artist: John Howell, American, active 1930s-1960s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.21695

Machine Generated Data

Tags

Amazon
created on 2022-03-11

Person 99.2
Human 99.2
Clothing 99.1
Apparel 99.1
Person 97.8
Person 95
Chair 88.9
Furniture 88.9
Worker 83.8
Hairdresser 72.8
Female 62.3
Girl 59.1
Pajamas 55.3
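
The list above has the shape of an AWS Rekognition DetectLabels response: a label name paired with a confidence score. A minimal sketch of how tags like these could be reproduced with boto3 follows; the file name, region, and confidence cutoff are assumptions, not part of this record.

```python
import boto3

# Region and file name are assumptions for illustration only.
rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("photograph.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=55,  # lowest score shown above is 55.3
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```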

Clarifai
created on 2023-10-22

people 99.7
woman 98.9
adult 98.8
monochrome 97.6
group 97.4
man 96.5
three 96.4
two 96.2
sit 96.2
elderly 94.7
indoors 92.5
medical practitioner 88.6
facial expression 88.4
wear 87.6
sitting 86.5
nostalgia 86.3
room 83.7
four 83.6
retro 83.5
hospital 83.1

Imagga
created on 2022-03-11

person 47.3
negative 43.1
film 35.9
people 35.1
patient 35
man 32.9
photographic paper 26.3
adult 25.1
medical 22.1
professional 21.8
male 21.3
doctor 19.7
case 19.6
medicine 18.5
sick person 18.4
health 18.1
photographic equipment 17.5
salon 17
clinic 16.5
work 15.7
laboratory 15.4
science 15.1
happy 15
equipment 15
education 14.7
test 14.4
hospital 14.4
smile 14.2
looking 13.6
portrait 13.6
human 13.5
technology 13.4
men 12.9
home 12.8
student 12.7
working 12.4
instrument 12.3
nurse 12.3
fashion 12.1
occupation 11.9
women 11.9
lab 11.7
team 11.6
worker 11.6
smiling 11.6
lifestyle 11.6
indoors 11.4
research 11.4
room 11.4
researcher 10.8
scientist 10.8
technician 10.8
newspaper 10.7
care 10.7
mask 10.7
chemistry 10.6
chemical 10.6
modern 10.5
development 10.5
glasses 10.2
clothing 9.9
chemist 9.8
coat 9.8
surgeon 9.8
biotechnology 9.8
assistant 9.7
scientific 9.7
illness 9.5
biology 9.5
shop 9.4
senior 9.4
casual 9.3
attractive 9.1
planner 9
one 9
biochemistry 8.9
product 8.7
teacher 8.4
camera 8.3
girls 8.2
lady 8.1
family 8
interior 8
life 7.9
business 7.9
observation 7.9
microbiology 7.9
happiness 7.8
surgery 7.8
black 7.8
old 7.7
style 7.4
indoor 7.3
uniform 7.2
face 7.1
creation 7.1

Google
created on 2022-03-11

Hat 87.9
Black-and-white 85.2
Style 83.9
Eyewear 78.1
Vintage clothing 73.9
Monochrome photography 72.9
Monochrome 72.3
Pattern 70.8
Event 69.6
Sitting 68.4
Classic 67.5
Shelf 62.3
Room 62.2
Retro style 58.1
Fun 57.2
Sunglasses 56
Font 55.6
Photo caption 54.7
History 51.8
Suit 51.1
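
Google's tags correspond to the label annotations returned by the Cloud Vision API. A minimal sketch with the google-cloud-vision client; the file name is an assumption, and credentials are taken from the environment.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # uses GOOGLE_APPLICATION_CREDENTIALS

with open("photograph.jpg", "rb") as f:  # assumed local file name
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # score is returned on a 0-1 scale; the list above shows it as a percentage
    print(f"{label.description} {label.score * 100:.1f}")
```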

Microsoft
created on 2022-03-11

person 99.5
text 99.1
clothing 91.5
black and white 81.2
human face 81.1
smile 67.3

Face analysis

AWS Rekognition

Age 45-51
Gender Female, 93.4%
Happy 90.2%
Sad 4.8%
Calm 2.2%
Confused 1.3%
Angry 0.5%
Disgusted 0.4%
Surprised 0.4%
Fear 0.3%

AWS Rekognition

Age 48-56
Gender Female, 94.3%
Calm 95.9%
Sad 2.7%
Happy 1%
Confused 0.1%
Angry 0.1%
Disgusted 0.1%
Surprised 0.1%
Fear 0%

AWS Rekognition

Age 43-51
Gender Female, 83.6%
Happy 98.4%
Sad 1.2%
Surprised 0.1%
Calm 0.1%
Angry 0.1%
Confused 0.1%
Fear 0%
Disgusted 0%
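
The age ranges, gender estimates, and emotion scores in the three blocks above match the FaceDetails structure that AWS Rekognition's DetectFaces returns when all facial attributes are requested. A minimal sketch, with the region and file name assumed as before:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photograph.jpg", "rb") as f:  # assumed local file name
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # needed for AgeRange, Gender, and Emotions
)

for face in response["FaceDetails"]:
    age, gender = face["AgeRange"], face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    # Emotions are not guaranteed to be sorted; order them like the lists above
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```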

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Unlikely
Blurred Very unlikely
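
Unlike Rekognition, Google Vision reports face attributes as likelihood buckets (VERY_UNLIKELY through VERY_LIKELY) rather than percentages, which is why the three blocks above read "Very unlikely", "Unlikely", and "Possible". A minimal sketch with the google-cloud-vision client; the file name is an assumption.

```python
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photograph.jpg", "rb") as f:  # assumed local file name
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)
for face in response.face_annotations:
    for name, value in [
        ("Surprise", face.surprise_likelihood),
        ("Anger", face.anger_likelihood),
        ("Sorrow", face.sorrow_likelihood),
        ("Joy", face.joy_likelihood),
        ("Headwear", face.headwear_likelihood),
        ("Blurred", face.blurred_likelihood),
    ]:
        # prints e.g. VERY_UNLIKELY, UNLIKELY, POSSIBLE, LIKELY, VERY_LIKELY
        print(name, vision.Likelihood(value).name)
```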

Feature analysis

Amazon

Person
Person 99.2%
Person 97.8%
Person 95%

Text analysis

Amazon

1089-1
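
The single detected string above is consistent with AWS Rekognition's DetectText output, which returns line- and word-level detections with confidences. A minimal sketch, again with an assumed region and file name:

```python
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")  # assumed region

with open("photograph.jpg", "rb") as f:  # assumed local file name
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # skip the duplicate word-level entries
        print(detection["DetectedText"], f"{detection['Confidence']:.1f}")
```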