Human Generated Data

Title

Untitled (portrait of a painter)

Date

c. 1920

People

Artist: Nicholas Ház, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, 2.2002.1843

Machine Generated Data

Tags (confidence scores, 0–100)

Amazon
created on 2021-12-14

Person 98.9
Human 98.9
Man 83.2
Face 76.1
Clothing 72.3
Apparel 72.3
Portrait 64.4
Photography 64.4
Photo 64.4
Finger 59.8

Imagga
created on 2021-12-14

child 54
person 38.9
home 36.7
adult 34.4
male 34.4
people 31.3
indoors 29.9
man 29.6
happy 28.9
sitting 28.4
smiling 25.3
portrait 20.7
family 20.5
casual 20.4
couch 20.3
mother 20.2
attractive 18.9
happiness 18.8
lifestyle 18.8
parent 18.5
indoor 18.3
father 18.2
juvenile 17.5
sofa 17.3
smile 17.1
looking 16.8
face 16.4
color 16.1
dad 16.1
handsome 16.1
love 15.8
couple 14.8
baby 14.8
hospital 14.5
son 14.4
horizontal 14.3
room 14.1
mature 14
boy 13.9
camera 13.9
computer 13.8
clothing 13.1
20s 12.8
relaxing 12.7
one 12.7
half length 12.7
30s 12.5
relaxed 12.2
pretty 11.9
adolescent 11.6
mid adult 11.6
kid 11.5
together 11.4
cheerful 11.4
adults 11.4
laptop 11.3
professional 11.2
expression 11.1
children 10.9
house 10.9
husband 10.8
holding 10.7
daughter 10.5
reading 10.5
females 10.4
business 10.3
hair 10.3
neonate 10.1
cute 10.1
confident 10
middle aged 9.7
women 9.5
men 9.5
day 9.4
relaxation 9.2
inside 9.2
lady 8.9
patient 8.9
office 8.8
40s 8.8
thirties 8.8
brunette 8.7
only 8.6
bed 8.6
wife 8.5
living 8.5
sit 8.5
senior 8.4
domestic 8.3
care 8.2
single 8.2
alone 8.2
businesswoman 8.2
businessman 8
life 7.9
bright 7.9
living room 7.8
affectionate 7.8
corporate 7.7
affection 7.7
looking camera 7.7
health 7.7
two 7.6
clothes 7.5
interior 7.1

Microsoft
created on 2021-12-14

person 99.7
text 98.3
human face 96.6
clothing 91.7
sketch 83.4
drawing 83.3
portrait 78.8
man 76.1
black and white 57.9
dish 57.4

Face analysis

AWS Rekognition

Age 22-34
Gender Male, 96.3%
Calm 97.5%
Sad 1%
Angry 0.8%
Surprised 0.4%
Confused 0.3%
Happy 0%
Fear 0%
Disgusted 0%

Microsoft Cognitive Services

Age 32
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.9%

Captions

Microsoft

a person sitting on a table 78.2%
a person sitting in front of a laptop 68.2%
a person sitting at a table 68.1%