Human Generated Data

Title

Untitled (studio portrait of man seated holding violin and bow)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6024

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99
Person 99
Person 98.4
Musical Instrument 75.1
Musician 75.1
Apparel 74.7
Clothing 74.7
Plant 74.7
Blossom 74.7
Flower 74.7
Suit 72.8
Overcoat 72.8
Coat 72.8
Flower Arrangement 61.8
Vase 61.6
Pottery 61.6
Jar 61.6
Sleeve 59.9
Leisure Activities 59.6
Long Sleeve 59.1
Tuxedo 57.6
Crowd 56.3
Text 55.6
Flower Bouquet 55.2

Clarifai
created on 2019-11-16

people 99.8
man 96.9
adult 96.7
wear 96.5
group 96
woman 94.5
outfit 94.5
music 94.4
musician 90.6
two 86.3
movie 83.1
collection 82.4
actor 82.3
singer 80.4
military 80.2
administration 78.9
group together 78.1
portrait 75.7
three 75.4
outerwear 75.1

Imagga
created on 2019-11-16

world 40.7
kin 37.4
silhouette 23.2
people 22.9
man 22.2
male 22
black 21.7
person 19.9
musical instrument 19.7
sunset 18.9
adult 18.3
portrait 16.8
accordion 16.1
keyboard instrument 13.9
outdoor 13.8
model 12.4
wind instrument 12.4
sport 12.4
couple 11.3
fashion 11.3
sexy 11.2
body 11.2
women 10.3
lifestyle 10.1
pretty 9.8
attractive 9.8
human 9.8
lady 9.7
sky 9.6
men 9.4
face 9.2
art 9.2
leisure 9.1
hand 9.1
summer 9
bride 8.8
happy 8.8
boy 8.7
love 8.7
happiness 8.6
dusk 8.6
youth 8.5
park 8.2
fun 8.2
style 8.2
pose 8.2
mother 7.5
dark 7.5
friendship 7.5
evening 7.5
statue 7.4
water 7.3
girls 7.3
group 7.3
fitness 7.2
dress 7.2
hair 7.1
clothing 7
together 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 97.2
man 90
person 85.2
posing 84
clothing 82.9
drawing 77.2
black and white 76.5
blackboard 66.7
music 56.6
gallery 53.9
old 48.5
picture frame 14

Color Analysis

Face analysis

AWS Rekognition

Age 21-33
Gender Male, 54.2%
Angry 45.1%
Fear 45%
Confused 45%
Surprised 45%
Calm 54.5%
Disgusted 45%
Sad 45.3%
Happy 45%

AWS Rekognition

Age 32-48
Gender Male, 92.1%
Surprised 0.9%
Sad 3.3%
Confused 1.5%
Happy 9.6%
Disgusted 4.1%
Fear 0.4%
Angry 10.6%
Calm 69.6%

Microsoft Cognitive Services

Age 50
Gender Male

Microsoft Cognitive Services

Age 29
Gender Male

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99%

Categories