Human Generated Data

Title

Untitled (woman in riding clothes seated in studio, right elbow on right knee)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American 1900 - 1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12848

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.2
Person 99.2
Apparel 99.1
Clothing 99.1
Sleeve 87.7
Long Sleeve 80
Footwear 79.7
Shoe 74.4
Performer 59.1
Leisure Activities 57.9
Pants 57.2
Table 56
Furniture 56
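The tag/score pairs above are in the style of an AWS Rekognition DetectLabels response. As a rough, hypothetical sketch of how such data is typically consumed (the response dict below is a hand-made stand-in modeled on the tags above, not the actual API output for this photograph), the labels can be filtered and sorted by confidence like this:

```python
def top_labels(response, min_confidence=50.0):
    """Return (name, confidence) pairs at or above min_confidence,
    sorted by confidence, highest first."""
    pairs = [(lbl["Name"], lbl["Confidence"]) for lbl in response["Labels"]]
    return sorted(
        (p for p in pairs if p[1] >= min_confidence),
        key=lambda p: p[1],
        reverse=True,
    )

# Hand-made sample shaped like a Rekognition DetectLabels response;
# values echo a few of the tags listed above.
sample = {
    "Labels": [
        {"Name": "Table", "Confidence": 56.0},
        {"Name": "Human", "Confidence": 99.2},
        {"Name": "Shoe", "Confidence": 74.4},
        {"Name": "Apparel", "Confidence": 99.1},
    ]
}

for name, conf in top_labels(sample, min_confidence=70.0):
    print(f"{name} {conf:.1f}")
```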

Clarifai
created on 2019-11-16

people 99.3
portrait 98.7
adult 98
one 97.8
woman 96.7
monochrome 96.6
wear 96.2
actor 95.2
actress 91.8
sit 91.4
indoors 89.6
man 89
music 85.7
sitting 85.6
fashion 84
chair 83.1
furniture 82.6
model 82.3
outfit 81.1
musician 80.8

Imagga
created on 2019-11-16

person 40.3
professional 34.8
business 34
portrait 31.1
adult 30.3
businesswoman 30
people 29
office 24.1
work 23.6
laptop 22.9
corporate 22.4
man 22.2
executive 22.1
attractive 21.7
male 21.3
job 21.2
computer 20.9
smile 20.7
happy 20.1
smiling 19.5
success 19.3
successful 19.2
domestic 18.9
sitting 18.9
women 18.2
working 16.8
pretty 16.1
businessman 15.9
worker 15.7
model 15.6
suit 15.5
confident 15.5
holding 14.9
phone 14.8
secretary 14.6
student 14.5
sexy 14.5
looking 14.4
face 14.2
desk 14.2
performer 14
brunette 13.9
comedian 13.9
sword 13.5
businesspeople 13.3
adolescent 13.2
glasses 13
black 12.9
one 12.7
technology 12.6
communication 12.6
cheerful 12.2
clothing 12.2
studio 12.2
manager 12.1
alone 11.9
happiness 11.8
indoors 11.4
telephone 11.2
weapon 11.1
friendly 11
guy 10.8
juvenile 10.7
notebook 10.6
education 10.4
casual 10.2
nurse 10
bow tie 9.8
fashion 9.8
necktie 9.8
entertainer 9.7
hair 9.5
talking 9.5
employee 9.5
men 9.5
one person 9.4
stylish 9
handsome 8.9
teacher 8.6
workplace 8.6
career 8.5
modern 8.4
shirt 8.4
human 8.3
indoor 8.2
lady 8.1
cute 7.9
call 7.8
elegant 7.7
expression 7.7
boss 7.7
cell 7.7
smart 7.5
pen 7.4
occupation 7.3
20s 7.3
lifestyle 7.2
table 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 99
wall 98.6
clothing 92.3
indoor 88.9
human face 88.8
text 77.2
piano 53.9

Face analysis

AWS Rekognition

Age 19-31
Gender Female, 79.7%
Sad 0.8%
Confused 0.1%
Fear 0%
Happy 0.2%
Surprised 0.1%
Angry 0.1%
Calm 98.8%
Disgusted 0%
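The emotion scores above follow the shape of an AWS Rekognition DetectFaces FaceDetail, where the highest-confidence entry (here Calm at 98.8%) is usually taken as the dominant emotion. A minimal sketch, using a hand-made stand-in dict that echoes the values above rather than the real API response:

```python
def dominant_emotion(face_detail):
    """Return (type, confidence) for the highest-confidence emotion."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

# Hand-made sample shaped like a Rekognition FaceDetail;
# values echo the scores listed above.
face = {
    "Emotions": [
        {"Type": "SAD", "Confidence": 0.8},
        {"Type": "CALM", "Confidence": 98.8},
        {"Type": "HAPPY", "Confidence": 0.2},
        {"Type": "SURPRISED", "Confidence": 0.1},
    ]
}

print(dominant_emotion(face))
```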

Microsoft Cognitive Services

Age 28
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.2%

Captions

Microsoft

a man and a woman sitting on a piano 68.4%
a man and a woman sitting at a piano 68.3%
a person sitting at a piano 68.2%