Human Generated Data

Title

Charity, Children: United States. New York. Pleasantville. Hebrew Sheltering Guardian Society: Hebrew Sheltering Guardian Society Orphan Asylum, Pleasantville, New York: Millinery and art classes in the Girls' Technical School.

Date

c. 1900

People

Artist: Wolfson, American

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.149.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Human 98.5
Person 98.5
Person 97.7
School 97.1
Person 96.7
Person 96.5
Classroom 96.4
Indoors 96.4
Room 96.4
Person 95.2
Person 92.3
Person 91.7
Restaurant 89.1
Cafeteria 86.6
Person 81.2
Person 80.5
Person 78.6
Person 69.5
Person 66.8
Lab 65.7
Building 57.7
Workshop 57.4
Wood 56.2
Plywood 56.2
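
The label/confidence pairs above are the kind of output the Amazon Rekognition DetectLabels API returns. A minimal sketch of how such tags could be generated with boto3; the image file name and the 50% confidence floor are illustrative assumptions, not part of this record:

```python
# Sketch: generating label tags with Amazon Rekognition via boto3.
# "photo.jpg" and the MinConfidence threshold are illustrative assumptions.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=50,  # drop labels scored below 50%
)

# Print each label with its confidence, similar to the list above.
for label in response["Labels"]:
    print(f'{label["Name"]} {label["Confidence"]:.1f}')
```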

Clarifai
created on 2019-06-05

people 99.9
furniture 99.8
room 99.3
group 99.2
many 99.1
desk 98.9
adult 98.7
group together 97.5
table 96.8
chair 95.9
man 95.5
education 94.8
woman 93.8
seat 93.6
indoors 91.5
administration 90.9
several 90.4
dining room 89
employee 88
monochrome 87.4

Imagga
created on 2019-06-05

interior 69
room 65.2
table 59.3
furniture 53
chair 47.3
restaurant 45.4
building 37.1
library 35.5
decor 33.6
house 32.6
modern 32.3
home 31.9
design 31.5
structure 29.6
dining 29.5
wood 27.5
floor 27
kitchen 26.3
style 24.5
luxury 22.3
cafeteria 22.3
indoor 21.9
dinner 21.4
indoors 21.1
decoration 21
glass 20.4
inside 19.3
comfortable 19.1
contemporary 18.8
light 18.7
drink 18.4
architecture 18.3
classroom 17.6
apartment 17.2
counter 17.1
hall 16.9
window 16.7
chairs 16.7
food 16.2
lamp 16.2
stool 15.8
service 15.8
bar 15.7
wall 15.5
seat 15.2
empty 14.6
hotel 14.3
wooden 14.1
tables 13.8
residential 13.4
elegance 12.6
eat 12.6
lunch 12.1
party 12
stove 12
elegant 12
meal 11.9
cook 11.9
plant 11.2
domestic 10.9
dining table 10.8
decorate 10.5
living 10.4
desk 10.1
nobody 10.1
relaxation 10.1
stylish 10
sink 9.9
oven 9.8
residence 9.7
setting 9.6
expensive 9.6
sofa 9.6
lifestyle 9.4
place 9.3
wine 9.2
metal 8.9
catering 8.8
vase 8.8
furnishing 8.7
cabinet 8.6
tile 8.6
space 8.5
drawer 7.9
refrigerator 7.9
serve 7.8
plate 7.8
lighting 7.7
mirror 7.6
relax 7.6
fashion 7.5
rest 7.4
coffee 7.4
island 7.3
dish 7.2

Google
created on 2019-06-05

Photograph 96.2
Snapshot 87.2
Room 86.3
Table 71.7
Furniture 69.1
Classroom 64.6
Stock photography 64
Photography 62.4
Class 60.4
History 54.1
Building 53.4
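
The Google tags above resemble output from Cloud Vision label detection. A minimal sketch using the google-cloud-vision client library; the file name is an illustrative assumption, and scores (returned as 0-1 floats) are scaled to percentages to match the list:

```python
# Sketch: label detection with the Google Cloud Vision client library.
# "photo.jpg" is an illustrative assumption; credentials come from the environment.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# Scores are floats in [0, 1]; scale to percentages like the list above.
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```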

Microsoft
created on 2019-06-05

table 99.7
indoor 99.3
wall 96.6
library 94.5
window 88.4
person 84
clothing 69.3
chair 52.6
furniture 18.6
cluttered 14.3

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50.9%
Disgusted 45.4%
Happy 45.4%
Angry 46.5%
Surprised 45.7%
Sad 48.1%
Confused 46.9%
Calm 46.9%

AWS Rekognition

Age 14-25
Gender Male, 51.4%
Confused 45.7%
Surprised 45%
Angry 46.1%
Happy 45.1%
Sad 47.1%
Calm 50.9%
Disgusted 45%

AWS Rekognition

Age 9-14
Gender Male, 53.9%
Happy 46.3%
Angry 47.3%
Calm 46.4%
Confused 46.2%
Disgusted 46%
Surprised 46%
Sad 46.9%

AWS Rekognition

Age 16-27
Gender Female, 54.6%
Happy 46.1%
Sad 45.7%
Confused 45.5%
Calm 50.7%
Angry 46%
Disgusted 45.3%
Surprised 45.6%

AWS Rekognition

Age 19-36
Gender Female, 52.5%
Confused 45.4%
Angry 45.9%
Disgusted 45.2%
Calm 46.5%
Happy 45.3%
Surprised 45.3%
Sad 51.3%

AWS Rekognition

Age 23-38
Gender Female, 50.5%
Disgusted 49.5%
Sad 50%
Calm 50%
Confused 49.5%
Happy 49.5%
Surprised 49.5%
Angry 49.5%

AWS Rekognition

Age 20-38
Gender Female, 53.7%
Calm 46.3%
Happy 45.1%
Sad 52.6%
Angry 45.4%
Confused 45.2%
Disgusted 45.1%
Surprised 45.2%

AWS Rekognition

Age 16-27
Gender Female, 50.3%
Disgusted 49.6%
Confused 49.5%
Happy 49.5%
Sad 49.5%
Surprised 49.5%
Calm 49.5%
Angry 50.4%

AWS Rekognition

Age 17-27
Gender Female, 50.2%
Happy 49.6%
Surprised 49.6%
Confused 49.5%
Disgusted 49.7%
Calm 49.8%
Angry 49.7%
Sad 49.6%

AWS Rekognition

Age 23-38
Gender Female, 50.2%
Angry 49.6%
Disgusted 49.6%
Sad 49.6%
Surprised 49.6%
Happy 49.6%
Calm 49.7%
Confused 49.7%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Angry 49.6%
Calm 49.7%
Sad 50%
Surprised 49.5%
Disgusted 49.5%
Happy 49.6%
Confused 49.5%

AWS Rekognition

Age 15-25
Gender Female, 53.9%
Angry 46.3%
Disgusted 46.5%
Sad 49.9%
Calm 45.5%
Happy 45.5%
Surprised 45.7%
Confused 45.6%

AWS Rekognition

Age 17-27
Gender Female, 50.1%
Calm 49.7%
Sad 50.1%
Surprised 49.5%
Angry 49.6%
Happy 49.5%
Confused 49.5%
Disgusted 49.5%

AWS Rekognition

Age 35-52
Gender Female, 50.2%
Angry 49.6%
Happy 49.6%
Confused 49.5%
Surprised 49.6%
Sad 49.6%
Disgusted 50%
Calm 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.5%
Happy 49.8%
Angry 49.6%
Calm 49.7%
Sad 49.7%
Surprised 49.6%
Confused 49.6%
Disgusted 49.6%

AWS Rekognition

Age 26-43
Gender Female, 50.3%
Sad 49.8%
Calm 49.8%
Happy 49.7%
Surprised 49.5%
Angry 49.6%
Disgusted 49.5%
Confused 49.6%
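
Each face block above (age range, gender, and per-emotion confidences) matches the shape of a Rekognition DetectFaces response. A minimal sketch of how such estimates could be produced, again assuming an illustrative local image file:

```python
# Sketch: per-face age, gender, and emotion estimates via Rekognition DetectFaces.
# "photo.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],  # request age, gender, and emotion attributes
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f'Age {age["Low"]}-{age["High"]}')
    print(f'Gender {gender["Value"]}, {gender["Confidence"]:.1f}%')
    for emotion in face["Emotions"]:
        print(f'{emotion["Type"].capitalize()} {emotion["Confidence"]:.1f}%')
```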

Feature analysis

Amazon

Person 98.5%

Categories

Imagga

interior objects 99.9%

Text analysis

Amazon

Armour's
MODEL
EISO

Google

ERSON CoP Armours GRAPE JUICE HUPMOBILE 191S MODEL TT TT ALESO
ERSON
CoP
JUICE
191S
ALESO
Armours
GRAPE
HUPMOBILE
MODEL
TT
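
The Amazon and Google strings above are OCR detections of text visible in the photograph. A minimal sketch of how the Amazon results could be produced with Rekognition DetectText; the file name is an illustrative assumption:

```python
# Sketch: reading visible text with Amazon Rekognition DetectText.
# "photo.jpg" is an illustrative assumption.
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

response = rekognition.detect_text(Image={"Bytes": image_bytes})

# LINE detections give whole phrases; WORD detections give the individual tokens.
for detection in response["TextDetections"]:
    print(detection["Type"], detection["DetectedText"], f'{detection["Confidence"]:.1f}')
```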