Human Generated Data

Title

Untitled (group portrait of thirteen female children inside home)

Date

1927, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.12325

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Person 99.6
Human 99.6
Person 99.1
Person 98.9
Person 98.7
Person 98.3
Person 97.7
Person 97.3
Person 97.2
Person 96.8
Person 96.2
Person 94.3
Person 92.7
Poster 86.4
Advertisement 86.4
People 86.2
Person 80.6
Indoors 77.1
Room 77.1
Interior Design 75.5
Screen 61.5
Electronics 61.5
School 59.9
Classroom 59.9
Apparel 59.7
Clothing 59.7
Shorts 58.2

Clarifai
created on 2019-11-16

people 99.9
group 99.2
group together 98
woman 95.9
many 95.7
adult 95.2
man 94
uniform 92.5
wear 91.8
child 90.1
several 86.9
outfit 86.6
music 84.8
indoors 84.5
room 83.8
education 82.3
portrait 79.5
musician 78.2
five 78.2
furniture 77.2

Imagga
created on 2019-11-16

kin 67.5
man 29.5
people 26.8
male 24.2
adult 20.3
person 18.7
happy 17.5
couple 17.4
family 16.9
child 16.9
world 16.1
happiness 14.9
room 14.4
love 14.2
black 13.8
smiling 13.7
together 12.3
mother 12
men 12
home 12
youth 11.9
portrait 11.6
barbershop 11.4
lifestyle 10.8
silhouette 10.8
interior 10.6
group 10.5
friends 10.3
women 10.3
two 10.2
classroom 9.9
cheerful 9.7
business 9.7
businessman 9.7
window 9.4
shop 9.1
fun 9
boy 8.7
husband 8.6
percussion instrument 8.5
teen 8.3
building 8.2
indoor 8.2
musical instrument 8.2
office 8
sexy 8
kid 8
working 7.9
holiday 7.9
sitting 7.7
outdoor 7.6
casual 7.6
studio 7.6
wife 7.6
togetherness 7.5
friendship 7.5
leisure 7.5
holding 7.4
light 7.3
teenager 7.3
girls 7.3
dress 7.2
looking 7.2
mercantile establishment 7.2
handsome 7.1
smile 7.1
summer 7.1
indoors 7
modern 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

clothing 99.2
posing 96.2
window 95.6
wall 95
smile 95
boy 94.7
person 93.7
human face 91.9
text 88.8
old 81.8
child 81.3
black 72.4
school 62.7
footwear 58.3
group 56.1
picture frame 10.3

Color Analysis

Face analysis

AWS Rekognition

Age 12-22
Gender Female, 53.7%
Disgusted 45.6%
Angry 45.4%
Calm 45%
Surprised 45.2%
Sad 45.2%
Confused 45.2%
Fear 49.4%
Happy 49%

AWS Rekognition

Age 10-20
Gender Male, 52.2%
Calm 45.2%
Sad 45%
Happy 54.8%
Angry 45%
Fear 45%
Surprised 45%
Confused 45%
Disgusted 45%

AWS Rekognition

Age 10-20
Gender Female, 54.8%
Happy 45%
Angry 45.8%
Disgusted 45%
Calm 53.2%
Fear 45%
Surprised 45%
Confused 45.1%
Sad 45.8%

AWS Rekognition

Age 16-28
Gender Female, 53.7%
Disgusted 45.1%
Angry 45.2%
Fear 45.4%
Happy 45.1%
Confused 45.4%
Sad 45.2%
Calm 52%
Surprised 46.7%

AWS Rekognition

Age 20-32
Gender Female, 53.1%
Angry 45.1%
Happy 45.3%
Disgusted 45.6%
Surprised 45.1%
Fear 45%
Sad 45.1%
Confused 45.1%
Calm 53.7%

AWS Rekognition

Age 16-28
Gender Female, 54.6%
Happy 45.1%
Disgusted 45%
Sad 45.3%
Angry 45%
Surprised 45%
Fear 45%
Calm 54.6%
Confused 45%

AWS Rekognition

Age 6-16
Gender Male, 52.9%
Angry 45.2%
Disgusted 45.1%
Happy 45.1%
Calm 52.9%
Sad 46.4%
Surprised 45.1%
Fear 45.1%
Confused 45.1%

AWS Rekognition

Age 3-11
Gender Male, 53%
Happy 45%
Angry 45.4%
Confused 46.2%
Calm 50.2%
Disgusted 45.2%
Fear 45.7%
Surprised 45.4%
Sad 46.8%

AWS Rekognition

Age 3-9
Gender Male, 53.7%
Confused 45.2%
Calm 50.5%
Surprised 45.3%
Fear 45.6%
Happy 45.3%
Angry 46.4%
Sad 46.7%
Disgusted 45.1%

AWS Rekognition

Age 4-14
Gender Male, 53.6%
Angry 54.3%
Disgusted 45%
Surprised 45%
Sad 45.6%
Confused 45%
Calm 45%
Fear 45%
Happy 45%

AWS Rekognition

Age 13-25
Gender Female, 50.7%
Calm 54.3%
Sad 45.2%
Happy 45%
Angry 45.4%
Fear 45%
Confused 45.1%
Surprised 45%
Disgusted 45%

AWS Rekognition

Age 22-34
Gender Female, 54.4%
Surprised 45.2%
Sad 45.5%
Confused 45.2%
Happy 45.1%
Disgusted 45.1%
Fear 45.2%
Angry 45.2%
Calm 53.6%

AWS Rekognition

Age 21-33
Gender Female, 50.5%
Happy 45%
Sad 45.1%
Disgusted 45.1%
Calm 51.4%
Angry 47.9%
Surprised 45.3%
Fear 45%
Confused 45.1%

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 8
Gender Female

Microsoft Cognitive Services

Age 25
Gender Female

Microsoft Cognitive Services

Age 19
Gender Female

Microsoft Cognitive Services

Age 23
Gender Female

Microsoft Cognitive Services

Age 22
Gender Female

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 21
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 26
Gender Female

Microsoft Cognitive Services

Age 17
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 99.6%
Poster 86.4%

Categories