Human Generated Data

Title

Untitled (man in chair holding puppet and baby)

Date

c. 1950

People

Artist: Lucian and Mary Brown, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.17788

Machine Generated Data

Tags

Amazon
created on 2022-02-26

Clothing 99
Apparel 99
Person 98.4
Human 98.4
Furniture 97.2
Chair 96.7
Person 95.9
Interior Design 94.6
Indoors 94.6
Couch 89.8
Person 89.7
Person 85.1
Living Room 83.7
Room 83.7
Face 79.1
Brick 78.8
Shorts 74.6
Sitting 72.7
Portrait 69.3
Photography 69.3
Photo 69.3
Baby 67.1
People 65.8
Floor 60.7
Hat 60
Pants 59.1
Female 56.8
Screen 56.7
Monitor 56.7
Electronics 56.7
Display 56.7
Newborn 55.1
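The flat "label confidence" pairs above match the shape of an AWS Rekognition label-detection response. As a minimal sketch (the response dict below is an illustrative stand-in built from a few values in the list, not the actual API payload for this photograph), the listing format could be derived like this:

```python
# Illustrative stand-in for a Rekognition detect_labels response;
# values are copied from the tag list above, not from a live API call.
response = {
    "Labels": [
        {"Name": "Clothing", "Confidence": 99.0},
        {"Name": "Person", "Confidence": 98.4},
        {"Name": "Chair", "Confidence": 96.7},
    ]
}

def flatten_labels(resp, min_confidence=55.0):
    """Return 'Name Confidence' lines in the tag-list format,
    dropping labels below the confidence threshold."""
    return [
        f"{lbl['Name']} {round(lbl['Confidence'], 1):g}"
        for lbl in resp["Labels"]
        if lbl["Confidence"] >= min_confidence
    ]

for line in flatten_labels(response):
    print(line)
```

The `min_confidence` cutoff is an assumption; the listing above simply appears to stop near 55.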

Imagga
created on 2022-02-26

man 39
people 37.9
person 32
male 30.6
room 27.4
couple 26.1
professional 25.7
home 25.5
adult 25.4
teacher 21.5
indoors 21.1
men 20.6
smiling 20.2
sitting 19.7
happy 19.4
work 19
office 18.3
table 16.6
women 16.6
medical 15.9
smile 15.7
worker 15.3
doctor 15
working 15
together 14.9
happiness 14.9
business 14.6
team 14.3
interior 14.1
lifestyle 13.7
two 13.5
modern 13.3
businessman 13.2
portrait 12.9
patient 12.9
computer 12.8
family 12.4
educator 12.2
groom 12.1
mature 12.1
coat 11.7
desk 11.7
nurse 11.6
cheerful 11.4
meeting 11.3
group 11.3
laptop 10.9
hospital 10.6
talking 10.5
clothing 10.4
health 10.4
negative 10.4
senior 10.3
love 10.3
teamwork 10.2
film 10.1
life 9.7
job 9.7
chair 9.7
husband 9.5
businesspeople 9.5
wife 9.5
corporate 9.4
clinic 9.4
house 9.2
barbershop 9.1
classroom 9
human 9
technology 8.9
medicine 8.8
lab 8.7
colleagues 8.7
laboratory 8.7
shop 8.5
togetherness 8.5
communication 8.4
old 8.4
holding 8.3
care 8.2
indoor 8.2
dress 8.1
suit 8.1
new 8.1
looking 8
bride 7.9
conference 7.8
education 7.8
color 7.8
attractive 7.7
elderly 7.7
casual 7.6
horizontal 7.5
instrument 7.5
wedding 7.4
lady 7.3
businesswoman 7.3
mother 7.3
bright 7.1
romantic 7.1
furniture 7.1
student 7.1
grandfather 7

Google
created on 2022-02-26

Microsoft
created on 2022-02-26

clothing 89.3
text 86.7
black and white 84.7
person 77
room 46.5

Face analysis

AWS Rekognition

Age 6-16
Gender Female, 95%
Calm 96.3%
Sad 2%
Confused 1.1%
Happy 0.2%
Surprised 0.1%
Angry 0.1%
Disgusted 0.1%
Fear 0%

AWS Rekognition

Age 10-18
Gender Male, 86%
Calm 98%
Happy 1.2%
Surprised 0.3%
Disgusted 0.1%
Sad 0.1%
Fear 0.1%
Angry 0.1%
Confused 0.1%
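The per-face emotion rankings above resemble the `Emotions` array that AWS Rekognition face detection returns, sorted by confidence. A minimal sketch, using illustrative values copied from the first face listed (not the actual API payload):

```python
# Illustrative stand-in for one face record from a Rekognition
# detect_faces response; values are taken from the listing above.
face = {
    "AgeRange": {"Low": 6, "High": 16},
    "Gender": {"Value": "Female", "Confidence": 95.0},
    "Emotions": [
        {"Type": "SAD", "Confidence": 2.0},
        {"Type": "CALM", "Confidence": 96.3},
        {"Type": "CONFUSED", "Confidence": 1.1},
    ],
}

def top_emotions(f):
    """Sort emotions by confidence, highest first, formatted
    as 'Emotion NN%' like the rankings above."""
    ranked = sorted(f["Emotions"], key=lambda e: e["Confidence"], reverse=True)
    return [f"{e['Type'].title()} {e['Confidence']:g}%" for e in ranked]

for line in top_emotions(face):
    print(line)
```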

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Likely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.4%

Captions

Microsoft

a person sitting on a table 75.5%
a person sitting at a table 75.4%
a man and a woman sitting on a table 49.1%

Text analysis

Amazon

KODVK-E.VEEIA