Human Generated Data

Title

Untitled (studio portrait of child in chair wearing knit coat and hat)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6032

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 98.7
Human 98.7
Person 97.9
Clothing 93.5
Apparel 93.5
Person 91.3
Shop 77.4
Coat 68.9
Monitor 67
Screen 67
Electronics 67
Display 67
Art 66.2
Overcoat 66.1
Window Display 63.4
Long Sleeve 60.9
Sleeve 60.9
Text 60.3
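
The Amazon tags above are label-detection output of the kind returned by Amazon Rekognition. Below is a minimal sketch of such a call, assuming boto3 is configured with AWS credentials and a default region; the file name "portrait.jpg" is a hypothetical stand-in for the digitized print, not part of this record.

```python
# Minimal sketch: label detection with Amazon Rekognition via boto3.
# Assumes configured AWS credentials; "portrait.jpg" is a hypothetical local file.
import boto3

rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,
        MinConfidence=60,
    )

# Each label carries a name and a confidence score, e.g. "Person 98.7".
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```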

Clarifai
created on 2019-11-16

people 99.9
adult 97.9
two 97.5
man 96.8
group 96.5
woman 95.3
wear 94.1
child 91.7
movie 91
actor 90.3
three 90.2
leader 89.5
outfit 89.4
administration 86.9
actress 84.9
family 83.3
four 82.5
several 82.2
group together 82.2
one 81
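
The Clarifai concepts above come from a general image-recognition model. Below is a minimal sketch of an equivalent request against Clarifai's v2 REST API; the API key placeholder and the "general-image-recognition" model id are assumptions, not values taken from this record.

```python
# Minimal sketch: concept tagging with Clarifai's v2 REST API via requests.
# The API key and the "general-image-recognition" model id are assumptions;
# substitute whichever general model your account exposes.
import base64
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"          # hypothetical placeholder
MODEL_ID = "general-image-recognition"     # assumed public general model

with open("portrait.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": image_b64}}}]},
)
response.raise_for_status()

# Concepts carry a 0-1 score; scale to match the percentages listed above.
for concept in response.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```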

Imagga
created on 2019-11-16

black 17.7
person 17.6
people 16.2
statue 15.8
art 15.5
man 15.5
male 14.2
silhouette 14.1
old 13.9
symbol 12.8
event 12
drawing 11.1
building 10.7
monument 10.3
grunge 10.2
sculpture 10.1
competition 10.1
sport 10
city 10
adult 9.8
vintage 9.1
player 9
history 8.9
design 8.7
antique 8.7
crowd 8.6
nation 8.5
park 8.3
light 8
flag 7.9
icon 7.9
vibrant 7.9
nighttime 7.8
architecture 7.8
portrait 7.8
match 7.7
famous 7.4
lights 7.4
pose 7.2
metal 7.2
religion 7.2
bright 7.1
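
The Imagga tags above can be reproduced with Imagga's v2 tagging endpoint, which authenticates with HTTP basic auth. A minimal sketch follows; the key, secret, and image URL are hypothetical placeholders.

```python
# Minimal sketch: auto-tagging with the Imagga v2 REST API via requests.
# Credentials and the image URL are placeholders, not values from this record.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # hypothetical placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # hypothetical placeholder
IMAGE_URL = "https://example.org/portrait.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each entry has a confidence (0-100) and a tag keyed by language code.
for item in response.json()["result"]["tags"]:
    print(f"{item['tag']['en']} {item['confidence']:.1f}")
```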

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.8
clothing 95.3
person 92.4
drawing 89.7
cartoon 86.6
man 81.8
black and white 68.4
sketch 63.1
footwear 53
picture frame 11.8
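
The Microsoft tags above correspond to the tag feature of the Azure Computer Vision "analyze" endpoint. Below is a minimal sketch, assuming a Cognitive Services resource; the endpoint host, API version, and subscription key are placeholders.

```python
# Minimal sketch: image tagging with the Azure Computer Vision analyze endpoint.
# The endpoint host, API version, and subscription key are assumptions.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # hypothetical

with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.1/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

# Tag confidences are 0-1; scale to match the percentages listed above.
for tag in response.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```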

Color Analysis

Face analysis

AWS Rekognition

Age 1-5
Gender Female, 98.2%
Fear 0%
Calm 0%
Disgusted 0%
Angry 0%
Surprised 0%
Happy 100%
Sad 0%
Confused 0%

AWS Rekognition

Age 6-16
Gender Male, 52.1%
Happy 45%
Disgusted 45%
Angry 55%
Fear 45%
Calm 45%
Surprised 45%
Sad 45%
Confused 45%

AWS Rekognition

Age 8-18
Gender Female, 52.3%
Fear 45.1%
Angry 46.7%
Calm 51.8%
Surprised 45.3%
Happy 45%
Confused 45.3%
Sad 45.6%
Disgusted 45%

AWS Rekognition

Age 26-42
Gender Female, 50%
Disgusted 49.5%
Confused 49.5%
Angry 50.3%
Happy 49.5%
Sad 49.6%
Calm 49.5%
Surprised 49.5%
Fear 49.5%

AWS Rekognition

Age 28-44
Gender Male, 50.2%
Happy 49.6%
Confused 49.5%
Calm 50%
Fear 49.6%
Angry 49.5%
Sad 49.6%
Surprised 49.5%
Disgusted 49.6%

AWS Rekognition

Age 17-29
Gender Male, 50.3%
Fear 49.5%
Surprised 49.5%
Angry 49.5%
Sad 49.6%
Disgusted 49.5%
Happy 49.5%
Calm 50.4%
Confused 49.5%
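
The AWS Rekognition blocks above (age range, gender, and per-emotion confidences) match the face-detection output Rekognition returns when all facial attributes are requested. Below is a minimal sketch, assuming configured AWS credentials and a hypothetical local file.

```python
# Minimal sketch: face analysis with Amazon Rekognition via boto3, requesting
# all attributes so age range, gender, and emotion confidences are returned.
# Assumes configured AWS credentials; "portrait.jpg" is a hypothetical file.
import boto3

rekognition = boto3.client("rekognition")

with open("portrait.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],
    )

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```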

Microsoft Cognitive Services

Age 3
Gender Female
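
The Microsoft Cognitive Services estimate above (age and gender) is the kind of result the Azure Face API's detect endpoint returned when those attributes were requested; Microsoft has since restricted these attributes. A sketch of the historical call shape, with the endpoint and key as placeholders, follows.

```python
# Minimal sketch: face detection with the Azure Face API, requesting the
# age and gender attributes shown above. Endpoint and key are placeholders;
# these attributes are no longer generally available from Microsoft.
import requests

FACE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # hypothetical
FACE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # hypothetical

with open("portrait.jpg", "rb") as f:
    image_bytes = f.read()

response = requests.post(
    f"{FACE_ENDPOINT}/face/v1.0/detect",
    params={"returnFaceAttributes": "age,gender"},
    headers={
        "Ocp-Apim-Subscription-Key": FACE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
response.raise_for_status()

for face in response.json():
    attrs = face["faceAttributes"]
    print(f"Age {attrs['age']:.0f}")
    print(f"Gender {attrs['gender'].capitalize()}")
```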

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very likely
Blurred Very unlikely
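
The Google Vision likelihoods above (surprise, anger, sorrow, joy, headwear, blurred) map to the face-annotation fields of the Cloud Vision API. Below is a minimal sketch, assuming the google-cloud-vision client library (v2 or later) and application default credentials.

```python
# Minimal sketch: face annotation with the Google Cloud Vision client library.
# Assumes application default credentials; "portrait.jpg" is a hypothetical file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("portrait.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.face_detection(image=image)

# Likelihoods are enum values such as VERY_UNLIKELY or VERY_LIKELY.
for face in response.face_annotations:
    print("Surprise", vision.Likelihood(face.surprise_likelihood).name)
    print("Anger", vision.Likelihood(face.anger_likelihood).name)
    print("Sorrow", vision.Likelihood(face.sorrow_likelihood).name)
    print("Joy", vision.Likelihood(face.joy_likelihood).name)
    print("Headwear", vision.Likelihood(face.headwear_likelihood).name)
    print("Blurred", vision.Likelihood(face.blurred_likelihood).name)
```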

Feature analysis

Amazon

Person 98.7%
Monitor 67%

Categories

Imagga

paintings art 99.7%
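
The Imagga category above ("paintings art") comes from a categorization model rather than the tagging endpoint. Below is a minimal sketch against Imagga's v2 categories endpoint; the "personal_photos" categorizer id, credentials, and image URL are assumptions.

```python
# Minimal sketch: categorization with Imagga's v2 categories endpoint.
# The "personal_photos" categorizer id, credentials, and image URL are assumptions.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # hypothetical placeholder
IMAGGA_SECRET = "YOUR_API_SECRET"  # hypothetical placeholder
IMAGE_URL = "https://example.org/portrait.jpg"  # hypothetical image location

response = requests.get(
    "https://api.imagga.com/v2/categories/personal_photos",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
response.raise_for_status()

# Each category has a confidence (0-100) and a name keyed by language code.
for category in response.json()["result"]["categories"]:
    print(f"{category['name']['en']} {category['confidence']:.1f}")
```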