Human Generated Data

Title

Untitled (two women, man, and boy in white jumpsuit seated in living room by column)

Date

c. 1940, printed later

People

Artist: Paul Gittings, American, 1900–1988

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12835

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.5
Person 99.5
Person 99
Person 96.2
Person 95.8
People 93.3
Apparel 90.4
Clothing 90.4
Family 82
Tie 76.2
Accessory 76.2
Accessories 76.2
Overcoat 61.9
Coat 61.9
Shorts 58.9
Performer 56.8

Clarifai
created on 2019-11-16

people 99.9
group 98.7
woman 97.9
adult 97.7
man 96.8
portrait 96
group together 94.5
child 91.6
wear 91.2
administration 90.5
leader 90.4
several 87.2
four 86.3
five 86
three 83.2
music 82.8
family 82.7
two 80.6
facial expression 79.3
wedding 77.5

Imagga
created on 2019-11-16

person 31.3
kin 31.1
people 30.1
man 28.9
male 26
adult 24
dark 20.9
portrait 20.7
black 19.9
couple 19.2
love 18.9
one 18.7
attractive 18.2
passion 16.9
sexy 16.1
happiness 15.7
happy 15
body 14.4
fashion 14.3
posing 14.2
model 14
lady 13.8
sensuality 13.6
human 13.5
world 13.1
lifestyle 13
elevator 12.9
sensual 12.7
style 12.6
pretty 12.6
erotic 12.3
looking 12
romance 11.6
family 11.6
room 11.4
silhouette 10.8
romantic 10.7
lifting device 10.3
sitting 10.3
men 10.3
hair 10.3
two 10.2
dress 9.9
passionate 9.8
night 9.8
together 9.6
seductive 9.6
relationship 9.4
youth 9.4
smile 9.3
studio 9.1
suit 9
device 8.9
performer 8.9
clothing 8.7
boy 8.7
water 8.7
business 8.5
togetherness 8.5
modern 8.4
old 8.4
vintage 8.3
wet 8
light 8
home 8
dancer 7.9
spectator 7.7
girlfriend 7.7
expression 7.7
dance 7.6
wife 7.6
fun 7.5
leisure 7.5
group 7.3
pose 7.2
handsome 7.1
cool 7.1
face 7.1
interior 7.1
businessman 7.1
golfer 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 99.5
human face 98.8
smile 98
person 96.4
man 92.7
standing 90.9
posing 79.5
black and white 79.2
black 75.8
white 75.6
text 72.7

Face analysis

AWS Rekognition

Age 22-34
Gender Female, 99.1%
Confused 0.8%
Surprised 0.5%
Sad 2.6%
Calm 1.8%
Disgusted 1.1%
Happy 91.4%
Fear 1%
Angry 0.7%

AWS Rekognition

Age 5-15
Gender Male, 90.3%
Angry 0%
Happy 97.2%
Disgusted 0%
Sad 0.1%
Calm 2.5%
Surprised 0.1%
Confused 0%
Fear 0%

AWS Rekognition

Age 39-57
Gender Female, 98%
Angry 10.1%
Sad 2.4%
Confused 4.7%
Disgusted 6.1%
Surprised 2.3%
Happy 23.8%
Fear 1.7%
Calm 48.8%

AWS Rekognition

Age 41-59
Gender Male, 99%
Disgusted 0.1%
Happy 0%
Angry 0.1%
Fear 0%
Sad 4.9%
Surprised 0.1%
Confused 0.3%
Calm 94.5%

Microsoft Cognitive Services

Age 30
Gender Female

Microsoft Cognitive Services

Age 58
Gender Female

Microsoft Cognitive Services

Age 61
Gender Male

Microsoft Cognitive Services

Age 9
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Possible
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.5%
Tie 76.2%

Text analysis

Amazon

TX0 664

Google

TXD 6 6 4
TXD
4
6