Human Generated Data

Title

Untitled (studio portrait of group of nine adults and children)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5995

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6

Clarifai
created on 2019-11-16

people 99.2
portrait 97.2
adult 96.7
music 96.7
monochrome 95.2
girl 95.2
woman 94.1
one 94
musician 91
art 90.4
concert 89.6
light 89.1
dark 87.8
performance 87.5
man 87.2
movie 86.4
dancer 85
stage 84.7
theater 84.7
shadow 84.4

Imagga
created on 2019-11-16

blackboard 37.3
person 27.3
black 26.4
people 22.9
model 21.8
man 20.9
adult 20.2
fashion 18.8
attractive 18.2
lady 17.9
body 17.6
portrait 17.5
male 17
sexy 16.9
pretty 16.8
kin 15.5
human 15
expression 14.5
face 14.2
love 14.2
one 14.2
silhouette 14.1
dark 13.4
world 13.1
hair 12.7
brunette 12.2
light 12
style 11.9
elegance 11.8
passion 11.3
shadow 10.8
studio 10.6
emotion 10.1
happy 10
make 10
art 9.9
looking 9.6
sensual 9.1
sensuality 9.1
window 9
posing 8.9
boy 8.8
women 8.7
dance 8.7
lifestyle 8.7
elegant 8.6
skin 8.5
makeup 8.2
child 8.1
night 8
couple 7.8
hands 7.8
dancer 7.7
youth 7.7
seductive 7.7
erotic 7.6
lighting 7.5
screen 7.4
slim 7.4
business 7.3
fitness 7.2
cute 7.2
spotlight 7.1
businessman 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.2
person 90
black 79.2
black and white 76.6
clothing 76.5
picture frame 9.4

Color Analysis

Face analysis

AWS Rekognition

Age 12-22
Gender Male, 79.5%
Calm 44.6%
Disgusted 0.9%
Sad 18%
Happy 0.1%
Angry 33.9%
Fear 0.4%
Surprised 0.4%
Confused 1.7%

AWS Rekognition

Age 4-14
Gender Male, 54.8%
Angry 45.3%
Happy 45%
Disgusted 45%
Calm 46.3%
Fear 46.2%
Surprised 45.1%
Sad 51.9%
Confused 45%

AWS Rekognition

Age 13-25
Gender Female, 51.7%
Fear 54.7%
Calm 45%
Sad 45.2%
Surprised 45%
Disgusted 45%
Angry 45%
Happy 45%
Confused 45%

AWS Rekognition

Age 16-28
Gender Female, 53.3%
Angry 45%
Confused 45%
Disgusted 45%
Happy 45%
Sad 45.6%
Fear 45.1%
Calm 54.2%
Surprised 45%

AWS Rekognition

Age 23-37
Gender Male, 50.2%
Sad 46.9%
Surprised 45.1%
Angry 45.1%
Confused 45.2%
Fear 45.1%
Happy 45.1%
Calm 52.5%
Disgusted 45.1%

AWS Rekognition

Age 30-46
Gender Female, 53.8%
Sad 45.3%
Calm 54.2%
Angry 45%
Fear 45%
Happy 45.3%
Confused 45.1%
Surprised 45%
Disgusted 45.1%

AWS Rekognition

Age 37-55
Gender Male, 54.7%
Calm 53.4%
Sad 46.2%
Fear 45%
Disgusted 45%
Confused 45.1%
Surprised 45%
Happy 45%
Angry 45.2%

AWS Rekognition

Age 23-37
Gender Male, 54%
Disgusted 45.1%
Calm 46.1%
Surprised 45.1%
Fear 45.3%
Happy 45%
Confused 45.1%
Sad 45.2%
Angry 53%

AWS Rekognition

Age 20-32
Gender Male, 53.6%
Confused 45%
Surprised 45%
Happy 45%
Sad 46.1%
Disgusted 45%
Angry 45.2%
Calm 53.5%
Fear 45.1%

AWS Rekognition

Age 13-23
Gender Male, 52.9%
Happy 45%
Fear 45.1%
Angry 45.7%
Confused 45.1%
Calm 47%
Disgusted 45%
Surprised 45%
Sad 52%
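Each AWS Rekognition block above lists per-face emotion confidences. In practice these come from the `FaceDetails[n]["Emotions"]` field of a `detect_faces` response (boto3); the sketch below does not call the API, it only shows how confidences like those listed can be reduced to a single dominant label. The scores are copied from the first face entry above.

```python
# Reduce Rekognition-style per-face emotion confidences to the top label.
# These scores are taken from the first AWS Rekognition face block above;
# in a live pipeline they would come from boto3's
# rekognition.detect_faces(..., Attributes=["ALL"]) response (assumed here,
# not called).

face_emotions = {
    "Calm": 44.6, "Disgusted": 0.9, "Sad": 18.0, "Happy": 0.1,
    "Angry": 33.9, "Fear": 0.4, "Surprised": 0.4, "Confused": 1.7,
}

def dominant_emotion(scores):
    """Return the (label, confidence) pair with the highest confidence."""
    label = max(scores, key=scores.get)
    return label, scores[label]

label, conf = dominant_emotion(face_emotions)
print(f"{label} {conf}%")  # → Calm 44.6%
```

Note that the confidences for one face need not sum to 100, so the dominant label is a ranking, not a probability.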

Microsoft Cognitive Services

Age 23
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Possible
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%

Categories