Human Generated Data

Title

Untitled (portrait of women graduates with matching laurels, capes, and tassels holding diplomas)

Date

1935

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4107

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Clothing 99.6
Apparel 99.6
Person 99.6
Human 99.6
Person 99.5
Person 99.3
Person 99.1
Person 98.9
Person 98.8
Person 98.7
Person 98.7
Person 98.5
Person 97.8
Person 97.5
Suit 96.1
Overcoat 96.1
Coat 96.1
Crowd 91.7
Audience 91.7
Tuxedo 76.3
People 74.4
Speech 68.8
Person 65.2
Photography 61.4
Portrait 61.4
Face 61.4
Photo 61.4
Furniture 59.9
Fashion 57.9
Robe 57.9
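
The labels above match the shape of output from AWS Rekognition's DetectLabels API. A minimal sketch of how comparable tags could be pulled with boto3 follows; the file name, region, and thresholds are assumptions, not the project's actual pipeline:

```python
# Minimal sketch: Rekognition DetectLabels via boto3.
# "photo.jpg" and the region are assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_labels(
    Image={"Bytes": image_bytes},
    MaxLabels=50,
    MinConfidence=50,  # the list above bottoms out near 57.9
)

for label in response["Labels"]:
    # Confidence is reported in percent. The repeated "Person" rows
    # above likely come from per-instance detections, which DetectLabels
    # returns under each label's "Instances" field (this also explains
    # the single "Person 99.6%" entry in Feature analysis below).
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```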

Clarifai
created on 2019-06-01

people 99.8
group 99.1
group together 97.9
man 97.4
adult 96.7
many 95.4
woman 95
wear 91.9
crowd 91.4
retro 86.2
leader 83.4
several 83.4
vintage 79.9
portrait 79.6
partnership 79.2
uniform 76.1
boy 74
child 70.3
outfit 70.2
school 69.3
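
Clarifai's concepts could be reproduced with its v2 predict endpoint. A minimal sketch, assuming a general-recognition model; the API key and model ID below are placeholders:

```python
# Minimal sketch: Clarifai v2 predict over REST.
# API key and model ID are placeholders/assumptions.
import base64
import requests

API_KEY = "YOUR_CLARIFAI_API_KEY"        # placeholder
MODEL_ID = "general-image-recognition"   # assumed model ID

with open("photo.jpg", "rb") as f:
    b64 = base64.b64encode(f.read()).decode("utf-8")

resp = requests.post(
    f"https://api.clarifai.com/v2/models/{MODEL_ID}/outputs",
    headers={"Authorization": f"Key {API_KEY}"},
    json={"inputs": [{"data": {"image": {"base64": b64}}}]},
)
resp.raise_for_status()

for concept in resp.json()["outputs"][0]["data"]["concepts"]:
    # Concept values are 0-1; the list above shows them as percentages.
    print(f'{concept["name"]} {concept["value"] * 100:.1f}')
```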

Imagga
created on 2019-06-01

fountain 58.4
structure 45.8
architecture 29.7
building 21.3
city 19.9
travel 16.9
landmark 16.2
history 16.1
stone 15.5
scene 14.7
tourism 14
monument 14
statue 13.4
old 12.5
world 12.3
people 11.7
black 11.4
water 11.3
marimba 11.2
picket fence 11.1
tourist 10.9
sculpture 10.6
group 10.5
landscape 10.4
column 10.3
park 10.1
fence 9.8
waterfall 9.7
crowd 9.6
sky 9.6
percussion instrument 9.5
men 9.4
historic 9.2
musical instrument 8.9
cathedral 8.7
ancient 8.6
day 8.6
house 8.5
memorial 8.5
hall 8.5
winter 8.5
art 8.5
town 8.3
silhouette 8.3
outdoors 8.2
new 8.1
barrier 8.1
river 8
marble 7.9
military 7.7
famous 7.4
window 7.3
national 7.2
scenery 7.2
baron 7.1
scenic 7
groom 7
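
Imagga exposes its tagger over a simple REST endpoint with HTTP basic auth. A minimal sketch, assuming a hosted copy of the image; credentials and URL are placeholders:

```python
# Minimal sketch: Imagga /v2/tags endpoint.
# Credentials and image URL are placeholders.
import requests

AUTH = ("YOUR_IMAGGA_KEY", "YOUR_IMAGGA_SECRET")  # placeholders
IMAGE_URL = "https://example.org/photo.jpg"       # placeholder

resp = requests.get(
    "https://api.imagga.com/v2/tags",
    params={"image_url": IMAGE_URL},
    auth=AUTH,
)
resp.raise_for_status()

for item in resp.json()["result"]["tags"]:
    # Imagga returns confidences on a 0-100 scale and localized names.
    print(f'{item["tag"]["en"]} {item["confidence"]:.1f}')
```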

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 97.5
clothing 92.9
black and white 63.8
man 62.7
old 62.6
white 61.1
group 56.8
lined 28.7
several 13.3
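
The Microsoft tags resemble output from Azure Computer Vision's analyze operation with the Tags visual feature. A minimal sketch; the endpoint, API version, and key are placeholders, and the 2019 run may have used an earlier API version:

```python
# Minimal sketch: Azure Computer Vision "analyze" with Tags.
# Endpoint, API version, and key are placeholders/assumptions.
import requests

ENDPOINT = "https://YOUR_RESOURCE.cognitiveservices.azure.com"  # placeholder
KEY = "YOUR_AZURE_KEY"                                          # placeholder

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

resp = requests.post(
    f"{ENDPOINT}/vision/v3.2/analyze",
    params={"visualFeatures": "Tags"},
    headers={
        "Ocp-Apim-Subscription-Key": KEY,
        "Content-Type": "application/octet-stream",
    },
    data=image_bytes,
)
resp.raise_for_status()

for tag in resp.json()["tags"]:
    # Confidences come back 0-1; the list above shows percentages.
    print(f'{tag["name"]} {tag["confidence"] * 100:.1f}')
```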

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 53.9%
Angry 45.3%
Sad 46.8%
Disgusted 45.1%
Surprised 45.2%
Happy 47.3%
Calm 50%
Confused 45.2%

AWS Rekognition

Age 35-52
Gender Female, 50.7%
Sad 51.5%
Confused 45.6%
Disgusted 45.2%
Surprised 45.3%
Angry 45.4%
Happy 45.2%
Calm 46.8%

AWS Rekognition

Age 26-43
Gender Female, 54.2%
Angry 45.1%
Calm 45.4%
Sad 45.1%
Surprised 45.1%
Disgusted 45.1%
Happy 54.1%
Confused 45.1%

AWS Rekognition

Age 26-43
Gender Female, 52.4%
Angry 45.8%
Surprised 45.5%
Calm 46.9%
Sad 47%
Confused 45.4%
Disgusted 45.3%
Happy 49.1%

AWS Rekognition

Age 26-43
Gender Female, 51.1%
Happy 52.4%
Confused 45.2%
Disgusted 45.1%
Sad 45.5%
Calm 46.4%
Angry 45.2%
Surprised 45.2%

AWS Rekognition

Age 20-38
Gender Female, 50.8%
Sad 48.7%
Angry 45.5%
Surprised 45.3%
Calm 47.8%
Disgusted 45.3%
Happy 47.1%
Confused 45.3%

AWS Rekognition

Age 26-43
Gender Female, 51.2%
Sad 45.6%
Happy 48.9%
Confused 45.2%
Angry 45.3%
Surprised 45.2%
Disgusted 45.2%
Calm 49.5%
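
Each block above mirrors one entry of AWS Rekognition's DetectFaces response when all facial attributes are requested. A minimal sketch that prints the same fields (file name and region are assumptions):

```python
# Minimal sketch: Rekognition DetectFaces with full attributes,
# printing age range, gender, and per-emotion confidences as above.
# "photo.jpg" and the region are assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, emotions, etc.
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        # Emotion types arrive uppercase, e.g. "HAPPY".
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```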

Feature analysis

Amazon

Person 99.6%

Categories

Text analysis

Amazon

9420000
J
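
Fragments like these are typical of raw OCR output from AWS Rekognition's DetectText. A minimal sketch (file name and region are assumptions):

```python
# Minimal sketch: Rekognition DetectText, which returns raw detected
# strings of the kind listed above. "photo.jpg" and region are assumptions.
import boto3

client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = client.detect_text(Image={"Bytes": image_bytes})

for det in response["TextDetections"]:
    if det["Type"] == "LINE":  # skip per-word duplicates
        print(det["DetectedText"])
```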