Human Generated Data

Title

Charity, Children: United States. New York. Syracuse. Onondaga Orphan Asylum: Orphan Asylum of Onondaga County, N.Y.: Kitchen

Date

c. 1900

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.673.2

Machine Generated Data

Tags

Amazon
created on 2019-06-07

Person 99.2
Human 99.2
Person 99.1
Person 97.5
Person 96.8
Person 91.3
Clinic 79.9
Building 65.2
Workshop 58.9
Hospital 58.7
Clothing 57.9
Apparel 57.9
Painting 56.1
Art 56.1
Factory 55.8

Clarifai
created on 2019-06-07

people 100
group 99.6
adult 99.2
many 98.7
room 98.4
furniture 98
group together 97.8
leader 97
administration 96.3
man 96
wear 94.8
woman 94.8
several 94.5
indoors 92.9
home 92.3
veil 89.7
education 89
military 87.7
chair 87.5
seat 86.4

Imagga
created on 2019-06-07

room 44.8
interior 43.3
chair 34
home 29.5
furniture 25.8
barbershop 24.3
table 23.6
house 23.4
shop 21.4
decor 21.2
indoors 21.1
inside 18.4
modern 18.2
design 18
floor 17.7
wood 17.5
old 16
decoration 16
architecture 15.6
window 15.6
mercantile establishment 15.5
nurse 14.3
style 14.1
indoor 13.7
wall 13.7
light 13.4
lamp 13.3
elegance 12.6
luxury 12
chairs 11.7
seat 11.7
people 11.7
glass 11.7
comfortable 11.5
dining 11.4
hospital 11.3
restaurant 11.2
classroom 11.1
apartment 10.5
place of business 10.5
vintage 9.9
person 9.9
family 9.8
wooden 9.7
office 9.6
mirror 9.5
nobody 9.3
dinner 9.3
life 9
building 8.9
kitchen 8.7
residential 8.6
empty 8.6
living 8.5
bathroom 8.3
domestic 8.2
retro 8.2
tables 7.9
patient 7.9
scene 7.8
space 7.8
sitting 7.7
hotel 7.6
antique 7.6
relaxation 7.5
city 7.5
new 7.3
lifestyle 7.2
musical instrument 7.1

Google
created on 2019-06-07

Microsoft
created on 2019-06-07

indoor 97.7
table 95.9
floor 95.6
clothing 91
person 88.7
room 81.9
building 81.9
chair 81.2
white 77.4
woman 74
old 68.5
furniture 35.6

Face analysis

Amazon

AWS Rekognition

Age 26-44
Gender Male, 53.9%
Calm 46.9%
Angry 45.7%
Confused 46.2%
Surprised 45.3%
Sad 49.1%
Happy 45.6%
Disgusted 46.2%

AWS Rekognition

Age 15-25
Gender Female, 50.7%
Confused 45.3%
Calm 45.7%
Happy 46%
Disgusted 45.2%
Sad 51.6%
Angry 45.8%
Surprised 45.4%

AWS Rekognition

Age 11-18
Gender Female, 50.4%
Disgusted 49.6%
Surprised 49.7%
Sad 49.6%
Confused 49.6%
Calm 49.7%
Happy 49.7%
Angry 49.6%

AWS Rekognition

Age 26-43
Gender Female, 54%
Disgusted 45.2%
Surprised 45.4%
Confused 45.3%
Sad 45.7%
Happy 45.7%
Calm 52.3%
Angry 45.3%

AWS Rekognition

Age 17-27
Gender Female, 51.8%
Confused 45.7%
Sad 45.9%
Disgusted 45.1%
Calm 51.7%
Happy 45.8%
Surprised 45.3%
Angry 45.5%

Feature analysis

Amazon

Person 99.2%
Painting 56.1%

Captions

Microsoft

a vintage photo of a living room 82.1%
a black and white photo of a living room 74.4%
a group of people in a room 74.3%

Text analysis

Amazon

to