Human Generated Data

Title

Untitled (three young women in casual dresses posed with studio prop)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12879

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 98.7
Person 98.7
People 95.3
Family 92.2
Clothing 90.2
Apparel 90.2
Female 89
Woman 74.2

Clarifai
created on 2019-11-16

people 99.9
portrait 99.3
wear 98.8
adult 98.6
two 98.4
woman 98.1
group 97.3
man 96.6
facial expression 96.4
three 92.1
administration 90.4
outfit 90
musician 88.2
music 87.9
sibling 85.6
four 82.7
actress 81
leader 79.1
singer 78.6
actor 78.5

Imagga
created on 2019-11-16

groom 36.6
people 31.2
adult 30.2
man 27.6
person 26.9
couple 25.3
nurse 24.6
portrait 24
male 23.7
happy 23.2
happiness 20.4
attractive 19.6
love 18.9
business 17
bride 16.5
two 16.1
pretty 16.1
professional 15.3
fashion 15.1
businessman 15
room 14.7
mother 14.6
smiling 14.5
dress 14.5
together 14
smile 13.5
brunette 13.1
standing 13
men 12.9
wedding 12.9
face 12.8
home 12.8
businesswoman 12.7
women 12.7
family 12.5
father 12.2
cheerful 12.2
sexy 12.1
office 12
lifestyle 11.6
parent 11.2
youth 11.1
child 10.9
suit 10.8
dad 10.6
married 10.5
one 10.5
businesspeople 10.4
relationship 10.3
executive 10.3
romantic 9.8
posing 9.8
black 9.7
style 9.6
kin 9.6
hair 9.5
corporate 9.5
life 9.3
elegance 9.2
holding 9.1
lady 8.9
group 8.9
bridal 8.8
husband 8.8
clothing 8.6
sitting 8.6
wife 8.5
bouquet 8.5
meeting 8.5
indoor 8.2
world 8.2
looking 8
job 8
indoors 7.9
work 7.8
boy 7.8
hugging 7.8
color 7.8
colleagues 7.8
expression 7.7
formal 7.6
hand 7.6
manager 7.5
20s 7.3
successful 7.3
sensuality 7.3
interior 7.1
working 7.1
model 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

person 99.5
human face 99.4
clothing 99.1
smile 98.9
wall 97.5
text 95.2
woman 87.4
black and white 70.2
posing 38.1
picture frame 6.1

Color Analysis

Face analysis

AWS Rekognition

Age 9-19
Gender Female, 52.3%
Fear 0.7%
Confused 0.7%
Happy 1.6%
Angry 1.8%
Disgusted 0.3%
Calm 58.6%
Surprised 0.3%
Sad 35.9%

AWS Rekognition

Age 22-34
Gender Female, 96.9%
Fear 1%
Surprised 0.1%
Confused 0.5%
Sad 83.5%
Disgusted 0.1%
Happy 0.1%
Calm 14.1%
Angry 0.7%

AWS Rekognition

Age 17-29
Gender Male, 54.2%
Disgusted 0%
Angry 0.2%
Fear 0.1%
Happy 0%
Confused 0%
Sad 15.6%
Calm 84.1%
Surprised 0%

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%

Categories

Imagga

people portraits 91.2%
events parties 8.6%