Human Generated Data

Title

Education, Industrial: United States. New York. New York City. Vacation Schools: New York City Public Schools. Examples of the Adaptation of Education to Special City Needs: Public School No. 147 Manhattan.: Vacation School - Carpenter Shop.

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.1458.2

Machine Generated Data

Tags

Amazon
created on 2019-06-04

Person 99.6
Human 99.6
Person 99.5
Person 99.4
Person 99.3
Person 97.9
Person 97.6
Person 94
Person 91.3
Clinic 90.9
Person 89.9
Person 87.2
Workshop 87.2
Ceiling Fan 83.7
Appliance 83.7
Person 82.5
Person 79.6
Person 79.5
Lab 76.2
Person 73.7
Person 71.2
Machine 65.5
Table 63.7
Furniture 63.7
Shoe 60.6
Apparel 60.6
Footwear 60.6
Clothing 60.6
Hospital 59.4
Operating Theatre 57.5
Plywood 56.5
Wood 56.5
Person 46.7

Clarifai
created on 2019-06-04

people 99.9
group 98.7
many 97.9
adult 97.5
military 97.5
man 97
war 96.5
group together 94.9
soldier 92.7
room 91.8
furniture 89.7
vehicle 87.9
administration 86.4
print 85.3
art 84.9
indoors 83.5
crowd 82.8
wear 82.4
woman 81.3
education 80.9

Imagga
created on 2019-06-04

barbershop 59.2
shop 49.1
mercantile establishment 37.2
house 26.8
cemetery 24.8
place of business 24.6
architecture 23.4
travel 19
city 18.3
home 16.7
interior 15.9
old 15.3
chair 15
tourism 14.8
building 14.8
room 14.2
sky 12.1
table 12.1
establishment 12
water 11.3
inside 11
antique 10.7
landscape 10.4
window 10.1
light 10
history 9.8
modern 9.8
glass 9.3
town 9.3
vintage 9.1
tourist 9.1
landmark 9
vacation 9
vehicle 8.7
sea 8.7
ancient 8.6
roof 8.6
luxury 8.6
wall 8.5
bridge 8.5
furniture 8.4
floor 8.4
wood 8.3
church 8.3
street 8.3
structure 8.2
indoors 7.9
urban 7.9
kitchen 7.8
culture 7.7
winter 7.7
old fashioned 7.6
retro 7.4
design 7.3
snow 7.2
machine 7.2
transportation 7.2
religion 7.2
tower 7.2
summer 7.1

Google
created on 2019-06-04

Room 78
Building 60.6
Vintage clothing 60
Furniture 59.4
Table 58.6
History 57.6
Art 50.2

Microsoft
created on 2019-06-04

person 95.8
indoor 90.1
old 85.9
clothing 84
furniture 66.4
table 64.5
working 60.3
several 16.1

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 50.7%
Happy 45.4%
Angry 50.2%
Sad 47.1%
Disgusted 45.3%
Confused 45.1%
Calm 46.8%
Surprised 45.1%

AWS Rekognition

Age 20-38
Gender Female, 53.3%
Angry 45.8%
Calm 52.2%
Confused 45.3%
Disgusted 45.1%
Happy 45.4%
Sad 45.7%
Surprised 45.5%

AWS Rekognition

Age 16-27
Gender Female, 50.4%
Confused 49.5%
Angry 49.5%
Calm 49.5%
Disgusted 49.5%
Sad 50.4%
Happy 49.5%
Surprised 49.5%

AWS Rekognition

Age 20-38
Gender Male, 52.2%
Angry 45.8%
Sad 48.2%
Surprised 45.4%
Confused 47.6%
Disgusted 45.1%
Happy 45%
Calm 47.9%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Sad 50.4%
Happy 49.5%
Calm 49.5%
Disgusted 49.5%
Angry 49.5%
Surprised 49.5%
Confused 49.5%

AWS Rekognition

Age 10-15
Gender Female, 50.3%
Disgusted 49.5%
Happy 49.6%
Confused 49.6%
Sad 50.2%
Calm 49.5%
Angry 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Sad 50.1%
Disgusted 49.5%
Surprised 49.5%
Angry 49.5%
Happy 49.8%
Calm 49.5%
Confused 49.5%

AWS Rekognition

Age 17-27
Gender Female, 50.4%
Calm 49.7%
Confused 49.5%
Sad 50.2%
Angry 49.5%
Happy 49.5%
Disgusted 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Disgusted 49.6%
Calm 49.7%
Happy 49.6%
Angry 49.6%
Sad 50%
Confused 49.6%
Surprised 49.6%

AWS Rekognition

Age 27-44
Gender Female, 50.4%
Surprised 49.5%
Angry 49.7%
Calm 49.7%
Confused 49.6%
Disgusted 49.6%
Happy 49.6%
Sad 49.9%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Disgusted 49.6%
Happy 49.5%
Sad 49.7%
Calm 49.5%
Angry 49.8%
Confused 49.7%
Surprised 49.7%

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Surprised 49.6%
Angry 49.5%
Calm 49.8%
Happy 49.8%
Sad 49.6%
Disgusted 49.7%
Confused 49.6%

AWS Rekognition

Age 20-38
Gender Female, 50.4%
Surprised 49.6%
Disgusted 49.6%
Angry 49.6%
Calm 49.7%
Sad 49.9%
Happy 49.6%
Confused 49.6%

AWS Rekognition

Age 11-18
Gender Male, 50.3%
Angry 49.5%
Surprised 49.5%
Calm 49.6%
Happy 49.6%
Disgusted 49.5%
Sad 50.2%
Confused 49.5%

AWS Rekognition

Age 14-25
Gender Female, 50.3%
Sad 49.8%
Happy 49.5%
Calm 49.7%
Disgusted 49.5%
Angry 49.7%
Surprised 49.6%
Confused 49.7%

Feature analysis

Amazon

Person 99.6%
Ceiling Fan 83.7%
Shoe 60.6%

Text analysis

Google

PS. MANHATTAN
PS.
MANHATTAN