Human Generated Data

Title

Education, Industrial: United States. New York. New York City. Public Schools, Adaptation to Special Needs: New York City Public Schools. Examples of the Adaptation of Education to Special City Needs: Public School No. 37 Manhattan: Housekeeping-Serving.

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.82.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Person 99
Human 99
Person 98.9
Chair 98.4
Furniture 98.4
Person 97.9
Chair 97.9
Chair 95.4
Person 91.9
Person 84.4
Clinic 83.1
Room 83
Indoors 83
Person 77.9
Person 63
People 59.3
School 59.2
Classroom 59.2
Workshop 59.1
Table 57.4
Steamer 56.8
Person 50.1
Person 42.2

Clarifai
created on 2019-06-05

people 99.6
many 98.8
chair 98.4
man 97.8
furniture 96.9
group 96.7
indoors 95.3
room 94.8
adult 94
group together 93.3
seat 90.4
table 90.4
woman 89.3
military 89
crowd 84.5
meeting 83.9
no person 83.2
music 81.5
sit 80.2
recreation 78.7

Imagga
created on 2019-06-05

sketch 100
drawing 83.4
representation 71.3
table 47.1
interior 46
room 44.6
chair 42.1
furniture 34.8
hall 33.6
house 32.6
floor 27
modern 23.8
chairs 23.5
wood 22.5
empty 21.5
architecture 21.1
home 21
design 20.8
decor 20.3
inside 18.4
window 18.3
luxury 18
seat 17.8
restaurant 17.4
dining 17.1
glass 17.1
tables 16.7
indoors 16.7
patio 16
structure 14.2
contemporary 14.1
area 13.5
dinner 13.5
light 13.4
style 13.4
urban 13.1
wall 13
plant 12.7
residential 12.4
hotel 12.4
desk 12.3
decoration 12.2
building 11.8
kitchen 11.6
office 11.2
deck 11
indoor 11
business 10.9
elegance 10.9
nobody 10.9
classroom 10.7
comfortable 10.5
lamp 10.5
resort 10.3
3d 10.1
relaxation 10.1
drink 10
city 10
stool 9.9
stove 9.8
tile 9.7
food 9.7
living 9.5
lunch 9.4
eat 9.2
travel 9.2
counter 8.8
scene 8.7
apartment 8.6
relax 8.4
people 8.4
summer 8.4
event 8.3
bar 8.3
meal 8.1
sun 8.1
cabinet 7.9
space 7.8
sofa 7.7
sky 7.7
outdoor 7.6
door 7.6
estate 7.6
tourism 7.4
wine 7.4
life 7.2
residence 7.1
wooden 7

Google
created on 2019-06-05

Photograph 95.8
Text 85.2
Snapshot 84.5
Room 84.3
Class 67.6
Table 63.8
Classroom 63.1
History 57.6
Furniture 56.7
Family 53.1
Building 50.8

Microsoft
created on 2019-06-05

chair 95.5
indoor 94.4
furniture 94.3
table 94.2
floor 93.7
person 86.1
old 77.4
clothing 76.4

Face analysis

Amazon

AWS Rekognition

Age 45-63
Gender Female, 51.8%
Angry 46%
Disgusted 46.5%
Surprised 46%
Happy 47.3%
Calm 45.8%
Sad 47%
Confused 46.4%

AWS Rekognition

Age 26-43
Gender Male, 51.7%
Sad 47.6%
Happy 45%
Angry 51.6%
Calm 45.1%
Surprised 45.1%
Confused 45.4%
Disgusted 45.1%

AWS Rekognition

Age 20-38
Gender Female, 51.1%
Calm 46.1%
Disgusted 46%
Sad 50.4%
Happy 45.7%
Confused 45.5%
Surprised 45.4%
Angry 45.9%

AWS Rekognition

Age 16-27
Gender Female, 53.5%
Sad 47.8%
Disgusted 45.3%
Happy 45.3%
Surprised 45.5%
Angry 45.4%
Calm 50.1%
Confused 45.6%

AWS Rekognition

Age 23-38
Gender Female, 52.6%
Calm 46.5%
Surprised 45.7%
Happy 46.1%
Disgusted 45.4%
Angry 45.7%
Sad 50%
Confused 45.6%

AWS Rekognition

Age 11-18
Gender Female, 54.3%
Happy 45.6%
Surprised 45.7%
Angry 45.6%
Calm 46%
Sad 51%
Disgusted 45.3%
Confused 45.8%

AWS Rekognition

Age 26-43
Gender Female, 54.1%
Calm 47.6%
Disgusted 45.3%
Angry 47.7%
Happy 45.5%
Sad 46.8%
Surprised 45.8%
Confused 46.3%

AWS Rekognition

Age 48-68
Gender Female, 50.3%
Calm 49.5%
Disgusted 49.5%
Sad 49.6%
Happy 49.6%
Confused 49.5%
Surprised 49.5%
Angry 50.2%

AWS Rekognition

Age 19-36
Gender Female, 54.5%
Angry 45.6%
Happy 45.2%
Calm 45.3%
Confused 45.8%
Sad 52%
Disgusted 45.5%
Surprised 45.5%

AWS Rekognition

Age 15-25
Gender Female, 50.2%
Angry 49.5%
Disgusted 49.5%
Surprised 49.5%
Happy 49.5%
Calm 49.6%
Sad 50.3%
Confused 49.5%

AWS Rekognition

Age 11-18
Gender Female, 53.7%
Confused 45.5%
Angry 46.8%
Disgusted 47%
Calm 45.3%
Happy 46.3%
Surprised 45.5%
Sad 48.6%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Confused 49.5%
Surprised 49.5%
Sad 50.4%
Calm 49.5%
Happy 49.5%
Angry 49.5%
Disgusted 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Sad 49.6%
Angry 49.6%
Happy 49.8%
Surprised 49.6%
Confused 49.6%
Calm 49.7%
Disgusted 49.6%

AWS Rekognition

Age 16-27
Gender Female, 53.1%
Angry 46.6%
Disgusted 46%
Calm 46.1%
Surprised 45.5%
Happy 45.6%
Sad 49.8%
Confused 45.5%

AWS Rekognition

Age 20-38
Gender Female, 50.3%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%
Calm 49.5%
Happy 49.6%
Angry 49.5%
Sad 50.3%

AWS Rekognition

Age 16-27
Gender Male, 50.3%
Calm 46.9%
Disgusted 45.9%
Happy 45.5%
Surprised 46%
Sad 47%
Angry 48.1%
Confused 45.7%

Feature analysis

Amazon

Person 99%
Chair 98.4%

Captions

Microsoft

a vintage photo of a group of people in a room 84.6%
a vintage photo of some people in a room 84.5%
a vintage photo of a group of people standing in a room 78.1%

Text analysis

Amazon

Jerving
C8
MAN

Google

P.S.31
MAN
8
e 8 P.S.31 MAN Housekeeping-Serving
e
Housekeeping-Serving