Human Generated Data

Title

Untitled (woman holding baby in hospital room with attending nurse)

Date

1949

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6246

Machine Generated Data

Tags

Amazon
created on 2022-01-22

Person 99.1
Human 99.1
Person 88
Furniture 86.8
Person 76.6
Finger 76.1
Home Decor 73.4
Indoors 65.8
Clinic 64.5
Room 63.9
Painting 63.2
Art 63.2
Bed 62
Clothing 60.3
Apparel 60.3
Chair 58.7
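Label lists like the one above pair each tag with a confidence score, and downstream consumers typically keep only tags above a threshold. A minimal sketch of that filtering step, using sample data copied from this record rather than a live call (in a real pipeline the list would come from an API such as Amazon Rekognition's `detect_labels`):

```python
# Filter machine-generated image labels by a minimum confidence score.
# The sample values mirror part of the Amazon tag list in this record;
# live data would come from a labeling API, e.g. boto3's
# rekognition.detect_labels(Image=..., MinConfidence=...).

def filter_labels(labels, min_confidence=70.0):
    """Return (name, confidence) pairs at or above the threshold,
    sorted from most to least confident."""
    kept = [(name, score) for name, score in labels if score >= min_confidence]
    return sorted(kept, key=lambda pair: pair[1], reverse=True)

sample = [
    ("Person", 99.1), ("Furniture", 86.8), ("Finger", 76.1),
    ("Home Decor", 73.4), ("Indoors", 65.8), ("Clinic", 64.5),
]

print(filter_labels(sample))
# keeps the four labels scoring 70 or above, highest first
```

Each tagging service in this record (Amazon, Clarifai, Imagga, Microsoft) reports scores on a 0–100 scale, so the same thresholding applies to any of the lists shown here.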

Clarifai
created on 2023-10-26

people 99.7
man 98.9
adult 98.7
monochrome 98.7
woman 98
child 95.7
portrait 95.5
art 95
wear 93
group 92.1
baby 90.3
two 88.4
illness 86.6
sit 85.8
hospital 85.6
indoors 85.2
couple 84.9
bed 84.6
actor 83.2
three 82.1

Imagga
created on 2022-01-22

person 29
man 28.2
male 22.7
mask 22.6
people 22.3
surgeon 21.7
adult 17.6
patient 16.5
computer 14.6
business 14.6
work 14.2
laptop 14.1
technology 14.1
professional 13.8
worker 13.3
astronaut 13
sitting 12.9
office 12.8
black 12.7
equipment 12.6
modern 11.9
working 11.5
one 11.2
men 10.3
suit 9.9
medical 9.7
medicine 9.7
helmet 9.5
doctor 9.4
lifestyle 9.4
nurse 9.3
smile 9.3
occupation 9.2
protection 9.1
job 8.8
businessman 8.8
sick person 8.8
case 8.6
profession 8.6
face 8.5
portrait 8.4
hospital 8.4
hand 8.3
human 8.2
danger 8.2
happy 8.1
covering 7.9
happiness 7.8
costume 7.8
corporate 7.7
war 7.7
attractive 7.7
health 7.6
fashion 7.5
future 7.4
style 7.4
glasses 7.4
car 7.4
safety 7.4
light 7.3
smiling 7.2
device 7.2
table 7.2
team 7.2
conceptual 7
indoors 7
protective covering 7

Google
created on 2022-01-22

Microsoft
created on 2022-01-22

text 99.1
person 91.1
black and white 82.8
clothing 81
man 56.5
picture frame 7.2

Color Analysis

Face analysis

AWS Rekognition

Age 38-46
Gender Male, 67.3%
Happy 71.9%
Sad 9.3%
Calm 9.3%
Surprised 3.3%
Confused 3.2%
Disgusted 1.2%
Angry 0.9%
Fear 0.8%
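The emotion percentages above follow the shape of an AWS Rekognition `DetectFaces` response, where each detected face carries a list of emotion/confidence pairs. A sketch of picking the dominant emotion from such a structure — the numbers below are copied from this record, not fetched live:

```python
# Pick the dominant emotion from a Rekognition-style face record.
# The scores mirror the AWS Rekognition block in this record; live data
# would come from rekognition.detect_faces(..., Attributes=["ALL"]).

face = {
    "Emotions": [
        {"Type": "HAPPY", "Confidence": 71.9},
        {"Type": "SAD", "Confidence": 9.3},
        {"Type": "CALM", "Confidence": 9.3},
        {"Type": "SURPRISED", "Confidence": 3.3},
        {"Type": "CONFUSED", "Confidence": 3.2},
        {"Type": "DISGUSTED", "Confidence": 1.2},
        {"Type": "ANGRY", "Confidence": 0.9},
        {"Type": "FEAR", "Confidence": 0.8},
    ],
}

def dominant_emotion(face_record):
    """Return (type, confidence) for the highest-scoring emotion."""
    top = max(face_record["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]

print(dominant_emotion(face))  # → ('HAPPY', 71.9)
```

Note that the scores are per-face probabilities summing to roughly 100, so reporting only the top entry (as the summary above does with "Happy 71.9%") discards the secondary signals.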

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.1%
Painting 63.2%

Captions

Text analysis

Amazon

NAGOX