Human Generated Data

Title

Untitled (one man and two women standing and sitting in front of porch)

Date

c. 1905-1915, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5993

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 98.8
Person 98.7
Person 98.5
Person 98.2
People 97.9
Person 97.2
Person 96.9
Person 96.6
Family 94.8
Person 94.3
Person 87.7
Person 85.6
Person 82
Clothing 78.9
Apparel 78.9
Person 66.4
Person 65.7
Suit 59.1
Coat 59.1
Overcoat 59.1
Shoe 56.1
Footwear 56.1
Shoe 53.3

Clarifai
created on 2019-11-16

people 100
group 99.9
group together 99.6
many 98.8
woman 98.4
adult 98.2
child 98
several 97.4
man 96.8
administration 96.2
offspring 92.3
leader 92.2
war 92
family 91.6
five 91.1
recreation 90.8
boy 88.7
room 88.4
four 88.1
street 87.9

Imagga
created on 2019-11-16

man 30.2
kin 30.1
people 26.8
person 22.7
adult 20.9
silhouette 19.9
male 19.3
musical instrument 18
black 18
business 16.4
businessman 14.1
window 14.1
couple 13.9
portrait 12.3
boy 12.2
wind instrument 12.1
men 12
photographer 11.5
urban 11.4
group 11.3
happy 11.3
accordion 11.1
happiness 11
room 10.5
art 10.2
office 9.8
one 9.7
women 9.5
love 9.5
motion 9.4
youth 9.4
professional 9.3
keyboard instrument 9.2
city 9.1
indoor 9.1
fashion 9
teacher 9
romantic 8.9
interior 8.8
world 8.7
dance 8.4
dark 8.4
barbershop 8.3
building 8.2
groom 8.1
body 8
together 7.9
shop 7.9
passenger 7.8
party 7.7
move 7.7
sport 7.5
style 7.4
light 7.4
family 7.1
face 7.1
performer 7.1
to 7.1
travel 7
architecture 7
modern 7
glass 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

building 99.1
clothing 98.3
person 97
window 95.1
suit 94.9
man 93.7
smile 85.4
woman 80.7
black and white 73.5
black 69.9
people 67.7
wedding dress 64.4
text 59.8
bride 53.3

Color Analysis

Face analysis

AWS Rekognition

Age 26-40
Gender Female, 54.7%
Angry 45%
Calm 45%
Confused 45%
Disgusted 45%
Surprised 45%
Happy 54.9%
Sad 45%
Fear 45%

AWS Rekognition

Age 45-63
Gender Female, 54.8%
Disgusted 45%
Surprised 45%
Happy 54.9%
Confused 45%
Sad 45%
Fear 45%
Angry 45%
Calm 45.1%

AWS Rekognition

Age 23-35
Gender Male, 54.9%
Angry 45.7%
Fear 45.1%
Disgusted 45.1%
Sad 48.2%
Confused 45.5%
Calm 50.2%
Happy 45.1%
Surprised 45.2%

AWS Rekognition

Age 2-8
Gender Female, 53.4%
Calm 45.7%
Surprised 45%
Sad 53.1%
Happy 45%
Fear 45.2%
Disgusted 45%
Angry 45.6%
Confused 45.4%

AWS Rekognition

Age 0-3
Gender Female, 54.3%
Happy 53.3%
Angry 45.1%
Confused 45.3%
Calm 45.3%
Disgusted 45.2%
Fear 45.1%
Surprised 45.1%
Sad 45.6%

AWS Rekognition

Age 33-49
Gender Female, 54.7%
Sad 45.6%
Disgusted 45.1%
Confused 45.2%
Surprised 45.1%
Angry 45.2%
Calm 53.7%
Happy 45.1%
Fear 45%

AWS Rekognition

Age 19-31
Gender Female, 51.2%
Confused 45.1%
Angry 45.3%
Surprised 45.1%
Calm 52.4%
Disgusted 45.1%
Fear 45.1%
Sad 45.7%
Happy 46%

AWS Rekognition

Age 29-45
Gender Female, 52.6%
Disgusted 45%
Calm 51.6%
Confused 46.1%
Sad 46.2%
Happy 45.1%
Angry 45.8%
Fear 45.1%
Surprised 45.1%

AWS Rekognition

Age 25-39
Gender Male, 53.8%
Calm 52%
Fear 45.1%
Confused 45.2%
Happy 45.3%
Angry 45.3%
Disgusted 45.1%
Sad 46.7%
Surprised 45.2%

AWS Rekognition

Age 23-35
Gender Male, 52.2%
Angry 45.2%
Happy 48.7%
Fear 46.4%
Disgusted 45.2%
Sad 46.7%
Calm 46.5%
Surprised 45.4%
Confused 45.9%

AWS Rekognition

Age 39-57
Gender Male, 54.8%
Surprised 45%
Calm 52.7%
Fear 45%
Disgusted 45%
Happy 45.4%
Angry 45.1%
Sad 46.7%
Confused 45.1%

AWS Rekognition

Age 50-68
Gender Male, 54.6%
Disgusted 45.3%
Happy 45%
Angry 46%
Fear 45.1%
Sad 51.5%
Surprised 45%
Confused 45.3%
Calm 46.7%

Microsoft Cognitive Services

Age 43
Gender Male

Microsoft Cognitive Services

Age 33
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very likely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 98.7%
Shoe 56.1%
