Human Generated Data

Title

Untitled (two photographs: studio portrait of two women seated around man standing in high collared jacket; young boy and girl standing)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6068

Machine Generated Data

Tags

Amazon
created on 2019-05-30

Person 99.7
Human 99.7
Apparel 99.6
Clothing 99.6
Person 99.1
Person 98.2
Person 97.7
Person 97.4
Coat 93.3
Suit 93.2
Overcoat 93.2
Tie 70.4
Accessories 70.4
Accessory 70.4
Footwear 61.5
Shoe 61.5
Sleeve 58.5
Pants 57.7
Jacket 57.4
Man 57.1
Stage 57.1
Door 56.8
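The machine-generated tags above are (label, confidence) pairs, and services such as Amazon Rekognition expose a minimum-confidence cutoff when returning them. A minimal, hypothetical sketch of filtering such a list by threshold (the data and threshold here are illustrative, copied from a few of the Amazon tags above; this is not the museum's actual pipeline):

```python
# Illustrative subset of the Amazon (label, confidence) tags above.
amazon_tags = [
    ("Person", 99.7), ("Apparel", 99.6), ("Clothing", 99.6),
    ("Coat", 93.3), ("Suit", 93.2), ("Tie", 70.4),
    ("Shoe", 61.5), ("Door", 56.8),
]

def filter_tags(tags, min_confidence=90.0):
    """Keep only tags at or above the confidence threshold,
    sorted from most to least confident."""
    return sorted(
        [(label, conf) for label, conf in tags if conf >= min_confidence],
        key=lambda t: -t[1],
    )

high_confidence = filter_tags(amazon_tags)
# e.g. [('Person', 99.7), ('Apparel', 99.6), ('Clothing', 99.6),
#       ('Coat', 93.3), ('Suit', 93.2)]
```

Lowering the threshold admits weaker guesses like "Tie" (70.4) and "Shoe" (61.5), which is why the low-scoring tags in the lists above should be read cautiously.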

Clarifai
created on 2019-05-30

people 99.9
group 99.4
group together 98.2
wear 97.2
man 95.8
adult 95.2
administration 95
many 93.9
woman 92.8
outfit 92.2
leader 91.8
room 90.7
music 90.6
several 85.1
musician 80.8
actor 80
indoors 80
five 76.6
military 73.9
furniture 72.1

Imagga
created on 2019-05-30

man 28.2
people 24.5
adult 23.8
business 21.3
male 19.9
person 17.8
office 17.1
businessman 16.8
fashion 16.6
urban 14
black 14
window 13.8
couple 13.1
life 12.5
musical instrument 12.4
outfit 12.1
groom 12.1
corporate 12
suit 12
professional 12
women 11.9
portrait 11.6
city 11.6
silhouette 11.6
lifestyle 11.6
happy 11.3
love 11
executive 10.9
attractive 10.5
group 10.5
walking 10.4
happiness 10.2
indoor 10
job 9.7
interior 9.7
style 9.6
men 9.4
wind instrument 9.3
model 9.3
two 9.3
clothing 9.3
human 9
world 9
sexy 8.8
body 8.8
together 8.8
standing 8.7
building 8.7
sitting 8.6
walk 8.6
togetherness 8.5
elegance 8.4
dark 8.3
kin 8.2
posing 8
room 8
modern 7.7
youth 7.7
boss 7.6
casual 7.6
one 7.5
manager 7.4
future 7.4
light 7.4
occupation 7.3
businesswoman 7.3
sensuality 7.3
looking 7.2
hair 7.1
jacket 7.1
working 7.1

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

clothing 98.4
man 93.6
footwear 90.7
person 89.9
standing 87.7
black and white 79.8
coat 75.1
suit 73.6
posing 66.9

Color Analysis

Face analysis

AWS Rekognition

Age 11-18
Gender Male, 52.9%
Surprised 45%
Disgusted 45%
Angry 45.1%
Confused 45.1%
Sad 45.1%
Happy 45%
Calm 54.7%

AWS Rekognition

Age 11-18
Gender Female, 54.3%
Sad 45.8%
Calm 52.2%
Angry 45.5%
Happy 45.4%
Disgusted 45.5%
Confused 45.5%
Surprised 45.1%

AWS Rekognition

Age 27-44
Gender Male, 53.7%
Confused 45.4%
Happy 45.1%
Surprised 45.2%
Calm 50.8%
Sad 47%
Angry 46.2%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Male, 54.7%
Happy 45%
Angry 45.2%
Sad 54.5%
Disgusted 45%
Calm 45.1%
Confused 45.2%
Surprised 45%

AWS Rekognition

Age 30-47
Gender Female, 54.7%
Happy 45.4%
Confused 45.1%
Sad 52.4%
Calm 46.6%
Surprised 45.1%
Disgusted 45.1%
Angry 45.3%
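Each AWS Rekognition face record above assigns a score to every emotion, with one value standing out from the rest; the reported mood is simply the highest-scoring entry. A minimal sketch of that selection (the scores are copied from the first face record above; the helper is illustrative, not Rekognition's API):

```python
# Emotion scores from the first AWS Rekognition face record above.
emotions = {
    "Surprised": 45.0, "Disgusted": 45.0, "Angry": 45.1,
    "Confused": 45.1, "Sad": 45.1, "Happy": 45.0, "Calm": 54.7,
}

def dominant_emotion(scores):
    """Return the (emotion, score) pair with the highest score."""
    return max(scores.items(), key=lambda item: item[1])

# dominant_emotion(emotions) picks "Calm" at 54.7 for this face.
```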

Microsoft Cognitive Services

Age 29
Gender Male

Microsoft Cognitive Services

Age 34
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.7%
Coat 93.3%
Tie 70.4%
Shoe 61.5%

Categories

Imagga

interior objects 99.5%