Human Generated Data

Title

Untitled (two photographs: double studio portrait of man with cello; studio portrait of two toddler boys standing in front of two blocks)

Date

1930-1945, printed later

People

Artist: Martin Schweig, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.10285

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 98.6
Human 98.6
Person 97.9
Person 96.6
Person 86.1
Cello 82.4
Musical Instrument 82.4
Apparel 71.2
Shoe 71.2
Clothing 71.2
Footwear 71.2
Advertisement 59.3
Poster 59.3
Collage 59.3

Clarifai
created on 2019-11-16

people 99.5
monochrome 97.2
group 95.7
woman 95.5
music 94.1
adult 94
man 93.6
musician 93.1
wear 92.6
indoors 90.8
facial expression 90.1
portrait 89.5
outfit 88.6
child 88
one 86
many 84.4
two 84.3
several 83.3
instrument 83
three 80.6

Imagga
created on 2019-11-16

black 28.6
adult 27.2
person 25.5
portrait 25.2
sexy 24.9
telephone 24
model 23.3
pay-phone 22.1
fashion 21.8
people 21.2
dress 20.8
pretty 20.3
attractive 20.3
style 19.3
electronic equipment 17.7
lady 17
hair 16.6
face 16.3
brunette 15.7
man 14.8
sax 14.5
posing 14.2
one 13.4
studio 12.9
elegance 12.6
call 12.5
equipment 12.3
makeup 11.9
women 11.9
art 11.7
cute 11.5
smile 11.4
elegant 11.1
device 11.1
youth 11.1
sensuality 10.9
make 10.9
gorgeous 10.9
pose 10.9
lifestyle 10.8
male 10.6
clothing 10.5
urban 10.5
looking 10.4
wind instrument 10
dark 10
sensual 10
bride 9.6
body 9.6
standing 9.6
seductive 9.6
window 9.3
musician 9.2
vintage 9.1
happy 8.8
couple 8.7
play 8.6
sitting 8.6
old 8.4
20s 8.2
office 8
interior 8
performer 7.9
luxury 7.7
desire 7.7
expression 7.7
hot 7.5
human 7.5
slim 7.4
figure 7.2
smiling 7.2
stylish 7.2
music 7.2
happiness 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

Face analysis

AWS Rekognition

Age 31-47
Gender Male, 85.4%
Surprised 1%
Angry 0.8%
Fear 0%
Calm 95.8%
Confused 1.8%
Sad 0.2%
Disgusted 0.2%
Happy 0.1%

AWS Rekognition

Age 0-3
Gender Female, 50.7%
Sad 45%
Disgusted 45%
Happy 53.4%
Confused 45%
Fear 45%
Angry 45%
Calm 46.5%
Surprised 45%

AWS Rekognition

Age 0-3
Gender Female, 51.8%
Confused 45%
Calm 45.2%
Sad 45%
Surprised 45%
Happy 54.7%
Disgusted 45%
Fear 45%
Angry 45%

AWS Rekognition

Age 22-34
Gender Male, 54.6%
Disgusted 45%
Fear 45%
Confused 45.4%
Calm 53.7%
Surprised 45.3%
Angry 45.3%
Sad 45.2%
Happy 45%

Microsoft Cognitive Services

Age 39
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.6%
Shoe 71.2%

Categories

Imagga

paintings art 100%

Text analysis

Amazon

a