Human Generated Data

Title

Untitled (baby and little boy on chair)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.16928

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Apparel 98.4
Clothing 98.4
Person 95.5
Human 95.5
Person 89.3
Baby 88.9
Chair 85.6
Furniture 85.6
Face 73.2
Painting 68.4
Art 68.4
Food 64.9
Meal 64.9
Coat 64
Finger 62.4
Dish 62
Child 60.9
Kid 60.9
Portrait 60.8
Photography 60.8
Photo 60.8
Newborn 60.5
People 59.7
Couch 58.5
Overcoat 56.1
Suit 56.1
Long Sleeve 55.5
Sleeve 55.5
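
The Amazon figures above read as label/confidence pairs from an image-labeling service such as AWS Rekognition. Below is a minimal sketch of how comparable labels might be generated with boto3; the file name and the MinConfidence threshold are placeholder assumptions, not part of the original record.

import boto3

# Placeholder file name; credentials and region come from the standard
# boto3 configuration (environment variables, ~/.aws/config, etc.).
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=50,
        MinConfidence=55,  # assumed cutoff, roughly matching the lowest scores listed
    )

# Each label carries a name and a confidence score (0-100), matching the
# "Tag 98.4"-style pairs in the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')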

Imagga
created on 2022-02-26

oxygen mask 65
breathing device 54.6
device 47.4
man 41
male 34
people 31.2
person 28.6
mask 26.3
adult 24.6
men 20.6
portrait 19.4
looking 19.2
face 19.2
human 15.7
professional 15.3
working 15
surgeon 14.5
work 14.1
doctor 14.1
happy 13.8
equipment 13.6
mature 13
business 12.7
suit 12.6
holding 12.4
medical 12.4
safety 12
chemical 11.6
businessman 11.5
couple 11.3
modern 11.2
health 11.1
love 11
lifestyle 10.8
black 10.8
leisure 10.8
smile 10.7
worker 10.7
medicine 10.6
one 10.4
occupation 10.1
profession 9.6
happiness 9.4
smiling 9.4
expression 9.4
senior 9.4
hand 9.1
protection 9.1
fashion 9
technology 8.9
protective covering 8.8
hospital 8.8
brass 8.7
hair 8.7
laboratory 8.7
uniform 8.5
old 8.4
fun 8.2
danger 8.2
office 8.1
instrument 8
science 8
job 8
women 7.9
standing 7.8
sitting 7.7
elderly 7.7
casual 7.6
illness 7.6
communication 7.6
style 7.4
environment 7.4
covering 7.4
helmet 7.2
color 7.2
family 7.1
camera 7.1
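
The Imagga tags follow the same label/confidence pattern. Below is a minimal sketch of a request against Imagga's v2 tagging endpoint, assuming HTTP Basic auth with an API key/secret pair; the credentials, the image URL, and the exact response shape are assumptions drawn from the public API documentation, not from this record.

import requests

# Placeholder credentials and image URL.
IMAGGA_KEY = "your_api_key"
IMAGGA_SECRET = "your_api_secret"
IMAGE_URL = "https://example.org/photo.jpg"

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Assumed response layout: result.tags is a list of {confidence, tag: {en}}.
for item in resp.json()["result"]["tags"]:
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')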

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

text 99.5
person 89.1
clothing 67.9

Face analysis

Amazon

AWS Rekognition

Age 4-12
Gender Female, 89.7%
Calm 91.7%
Sad 7.3%
Confused 0.2%
Angry 0.2%
Disgusted 0.2%
Happy 0.2%
Surprised 0.1%
Fear 0.1%
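
The age range, gender, and emotion percentages above match the shape of an AWS Rekognition DetectFaces response. Below is a minimal sketch with boto3; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f'Age {age["Low"]}-{age["High"]}')
    gender = face["Gender"]
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    # Emotions come back with confidence scores; sorting highest first
    # reproduces the Calm/Sad/... ordering shown above.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')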

Feature analysis

Amazon

Person 95.5%
Painting 68.4%

Captions

Microsoft

a group of people sitting at a table 66%
a group of people looking at a phone 39.3%
a group of people looking at a laptop 39.2%
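
The candidate captions appear to come from the Microsoft/Azure Computer Vision "describe image" operation, which returns several captions with confidence scores. Below is a minimal sketch against the v3.2 REST endpoint; the endpoint, key, image URL, and maxCandidates value are all assumptions for illustration.

import requests

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
KEY = "your_subscription_key"  # placeholder

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/describe",
    params={"maxCandidates": 3},
    headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
    json={"url": "https://example.org/photo.jpg"},  # placeholder image URL
)
resp.raise_for_status()

# Assumed response layout: description.captions is a list of {text, confidence}.
for caption in resp.json()["description"]["captions"]:
    print(f'{caption["text"]} {caption["confidence"] * 100:.1f}%')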

Text analysis

Amazon

YT33A2-YAOO
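
The detected string above is the kind of output AWS Rekognition's DetectText returns for edge markings or other printed text in a scan. Below is a minimal sketch with boto3; the file name is a placeholder.

import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = rekognition.detect_text(Image={"Bytes": f.read()})

# LINE detections are whole lines of text; WORD detections are the
# individual words inside them.
for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":
        print(detection["DetectedText"])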