Human Generated Data

Title

Untitled (Pinkerton Academy students, Derry, N.H. 1916)

Date

c.1916

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3942

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.5
Human 99.5
Person 99.4
Person 98.3
Person 96
Person 92.9
Crowd 92.7
Person 91.9
Person 88.2
Person 87.1
People 86.7
Person 85.9
Audience 84.9
Person 81.4
Person 79.9
Person 68.2
Funeral 67.9
Person 65.9
Person 56.4
Speech 55.5

Clarifai
created on 2019-06-01

people 99.6
child 95.4
many 95.3
group together 95.1
monochrome 94.4
woman 92.4
group 92.3
crowd 90.4
adult 88.8
education 86.6
man 86.1
school 84.7
wear 75.7
uniform 75.3
boy 74.3
portrait 73.9
leader 67.6
adolescent 65
desktop 62.6
street 60.9

Imagga
created on 2019-06-01

marimba 95.1
percussion instrument 76.9
musical instrument 59.7
people 26.8
man 18.2
business 17.6
city 16.6
winter 16.2
history 16.1
architecture 15.6
male 15.6
old 14.6
room 14.3
adult 13.8
group 13.7
businessman 13.2
men 12.9
travel 12.7
women 12.6
person 12.4
snow 11.9
office 11.4
classroom 11
outfit 10.9
tourism 10.7
human 10.5
scene 10.4
church 10.2
building 9.9
urban 9.6
professional 9.4
team 8.9
job 8.8
crowd 8.6
day 8.6
corporate 8.6
walking 8.5
historical 8.5
black 8.4
modern 8.4
religion 8.1
indoors 7.9
life 7.9
gymnasium 7.9
color 7.8
portrait 7.8
cold 7.7
outdoors 7.5
historic 7.3
worker 7.1
family 7.1
interior 7.1
working 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

person 96.7
posing 89.6
clothing 88.8
window 88.3
group 86.1
white 60.7
old 55.4
team 31.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 52.5%
Disgusted 45.3%
Angry 46.2%
Confused 45.9%
Sad 50.7%
Happy 45.5%
Calm 46%
Surprised 45.5%

Feature analysis

Amazon

Person
Person 99.5%

Categories

Imagga

paintings art 92.2%
interior objects 7.7%

Text analysis

Amazon

116
Ooeeeah