Human Generated Data

Title

Untitled (portrait of children standing and seated against painted backdrop)

Date

1938

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4064

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 99.7
Person 99.7
Apparel 99.4
Clothing 99.4
Person 99.2
Person 98.8
Dress 98.5
Person 97.9
Person 97.5
Footwear 97.4
Shoe 97.4
Person 96.1
People 93.4
Female 85.4
Girl 69
Family 66.2
Photography 64.3
Photo 64.3
Face 64.3
Portrait 64.3
Woman 63.1
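
The Amazon tags above are label/confidence pairs of the kind returned by AWS Rekognition's label detection. A minimal sketch of such a call, assuming boto3 credentials are configured and a local copy of the photograph exists; the filename, region, and thresholds are illustrative, not part of the record:

```python
# Hypothetical sketch: reproducing a label list like the one above with AWS Rekognition.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=25,
    MinConfidence=60,  # the listed tags bottom out around 63%
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```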

Clarifai
created on 2019-06-01

people 99.9
group 98.7
group together 97.5
adult 95.3
woman 94.8
man 94.6
child 94.6
many 89.9
wear 88.7
several 88.4
administration 87.3
family 86.9
room 84.4
five 84.3
four 84.2
leader 83.9
indoors 82.4
monochrome 82.1
home 82
three 81.1
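
The Clarifai concepts above are the kind of output returned by Clarifai's v2 predict endpoint for its public general model. A hedged sketch, assuming an API key, a reachable image URL, and the model identifier shown; all three are placeholders, not taken from the record:

```python
# Hypothetical sketch of a Clarifai v2 "general" model request yielding concept tags.
import requests

CLARIFAI_API_KEY = "YOUR_API_KEY"                   # assumption: a valid Clarifai key
CLARIFAI_MODEL_ID = "general-image-recognition"     # assumption: Clarifai's public general model
IMAGE_URL = "https://example.org/4.2002.4064.jpg"   # hypothetical image location

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{CLARIFAI_MODEL_ID}/outputs",
    headers={"Authorization": f"Key {CLARIFAI_API_KEY}"},
    json={"inputs": [{"data": {"image": {"url": IMAGE_URL}}}]},
)
resp.raise_for_status()

# Each concept carries a name and a 0-1 confidence, printed here as a percentage.
for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    print(f"{concept['name']} {concept['value'] * 100:.1f}")
```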

Imagga
created on 2019-06-01

brass 39.1
wind instrument 32.8
musical instrument 24.1
people 23.4
person 22.5
kin 20.3
male 19.1
man 18.8
player 16.8
black 15
cornet 15
sport 14.2
men 13.7
dress 13.5
adult 13.4
contestant 13
human 12.7
statue 12.7
art 12.1
golfer 12.1
active 11.7
portrait 11.6
silhouette 11.6
family 10.7
couple 10.4
old 10.4
nurse 10.3
trombone 10.1
bride 9.8
businessman 9.7
group 9.7
sculpture 9.7
happiness 9.4
outdoor 9.2
business 9.1
love 8.7
summer 8.4
fashion 8.3
exercise 8.2
happy 8.1
fitness 8.1
sexy 8
women 7.9
child 7.9
face 7.8
architecture 7.8
mother 7.8
marble 7.7
travel 7.7
culture 7.7
attractive 7.7
two 7.6
athlete 7.6
stone 7.6
life 7.6
fun 7.5
leisure 7.5
symbol 7.4
wedding 7.4
lady 7.3
suit 7.2
body 7.2
activity 7.2
team 7.2
history 7.2
groom 7.1
to 7.1
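
The Imagga tags above resemble the output of Imagga's v2 tagging endpoint. A minimal sketch, assuming an API key/secret pair and an image URL, both of which are placeholders:

```python
# Hypothetical sketch of an Imagga v2 tagging request producing tags like those above.
import requests

IMAGGA_KEY = "YOUR_API_KEY"        # assumption
IMAGGA_SECRET = "YOUR_API_SECRET"  # assumption
IMAGE_URL = "https://example.org/4.2002.4064.jpg"  # hypothetical image location

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=(IMAGGA_KEY, IMAGGA_SECRET),
)
resp.raise_for_status()

# Tags come back with an English label and a 0-100 confidence score.
for entry in resp.json()["result"]["tags"]:
    print(f"{entry['tag']['en']} {entry['confidence']:.1f}")
```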

Google
created on 2019-06-01

Photograph 96.6
Snapshot 87.2
Standing 84.2
Black-and-white 68.3
Room 65.7
Photography 62.4
Stock photography 62.1
Style 52.5
Family 51.8
Pattern 50.5
Art 50.2
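
The Google tags above are consistent with Google Cloud Vision label detection, which scores each label from 0 to 1. A minimal sketch, assuming application-default credentials and a local copy of the image; the filename is illustrative:

```python
# Hypothetical sketch of Google Cloud Vision label detection for an image file.
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # assumes application-default credentials

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are 0-1; scale to percentages to match the listing above.
for annotation in response.label_annotations:
    print(f"{annotation.description} {annotation.score * 100:.1f}")
```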

Microsoft
created on 2019-06-01

dress 96.6
clothing 95.2
person 93.4
woman 89.9
posing 81.4
smile 79.7
footwear 78.8
old 68.4
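
The Microsoft tags above are the kind of tag/confidence pairs returned by Azure Computer Vision image analysis. A minimal sketch against the v3.2 REST endpoint, assuming a resource endpoint, subscription key, and local image file, all of which are placeholders:

```python
# Hypothetical sketch of an Azure Computer Vision "analyze" call returning image tags.
import requests

AZURE_ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # assumption
AZURE_KEY = "YOUR_SUBSCRIPTION_KEY"                                   # assumption

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

resp = requests.post(
    f"{AZURE_ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": AZURE_KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

# Confidences are 0-1; scale to percentages to match the listing above.
for tag in resp.json()["tags"]:
    print(f"{tag['name']} {tag['confidence'] * 100:.1f}")
```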

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 51.4%
Sad 47.7%
Surprised 45.2%
Happy 45.2%
Angry 45.2%
Calm 51.4%
Confused 45.2%
Disgusted 45%

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Sad 45.5%
Angry 45.6%
Happy 48.4%
Confused 45.6%
Calm 48.9%
Surprised 45.7%
Disgusted 45.3%

AWS Rekognition

Age 26-43
Gender Male, 51.5%
Disgusted 45%
Calm 54.4%
Angry 45.1%
Sad 45.2%
Happy 45.2%
Surprised 45.1%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Male, 54.6%
Surprised 45.4%
Calm 52.8%
Disgusted 45.2%
Confused 45.2%
Sad 45.4%
Happy 45.8%
Angry 45.2%

AWS Rekognition

Age 26-43
Gender Male, 53.8%
Disgusted 45.3%
Angry 45.9%
Confused 45.4%
Sad 46.3%
Happy 49.2%
Calm 47%
Surprised 45.8%
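
The per-face age, gender, and emotion estimates above are the kind of results returned by AWS Rekognition face detection. A minimal sketch, assuming boto3 credentials and a local copy of the photograph; filename and region are illustrative:

```python
# Hypothetical sketch of AWS Rekognition face analysis (age range, gender, emotions).
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age range, gender, and emotions, not just landmarks
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```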

Feature analysis

Amazon

Person 99.7%
Shoe 97.4%
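
The Person and Shoe detections above correspond to instance-level hits that AWS Rekognition reports inside its label-detection response. A minimal sketch, under the same assumptions as the label example earlier:

```python
# Hypothetical sketch: object-like labels (Person, Shoe) carry an "Instances" list
# with per-instance confidences in the DetectLabels response.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_labels(Image={"Bytes": f.read()}, MinConfidence=90)

for label in response["Labels"]:
    for instance in label.get("Instances", []):  # empty for scene-level labels
        print(f"{label['Name']} {instance['Confidence']:.1f}%")
```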

Categories

Text analysis

Amazon

yavye
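
The Amazon text result above is the kind of output returned by AWS Rekognition text detection; on an old studio portrait it is often noisy. A minimal sketch, assuming boto3 credentials and a local copy of the image:

```python
# Hypothetical sketch of AWS Rekognition text detection for an image file.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

with open("4.2002.4064.jpg", "rb") as f:  # hypothetical local copy of the photograph
    response = rekognition.detect_text(Image={"Bytes": f.read()})

for detection in response["TextDetections"]:
    if detection["Type"] == "LINE":  # report whole lines rather than individual words
        print(detection["DetectedText"])
```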