Human Generated Data

Title

Races, Indians: United States. New York. Iroquois. Thomas Asylum for Orphan and Destitute Indian Children: State Thomas Asylum for Orphan and Destitute Indian Children, Iroquois, N.Y.: Carpentering

Date

c. 1903

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.2381.2

Machine Generated Data

Tags

Amazon
created on 2019-06-04

Building 99.9
Person 99.6
Human 99.6
Person 99.4
Factory 99.2
Person 98.3
Assembly Line 94.9
Person 94.4
Person 80.2
Housing 69.8
Person 62.7
Manufacturing 55.1

Clarifai
created on 2019-06-04

people 100
group 99.2
group together 98.9
adult 98.9
vehicle 98.4
transportation system 97.3
many 97.2
man 95.5
war 94
room 93.2
furniture 91.4
grinder 91
military 90.2
woman 90.2
railway 90.1
indoors 89.9
two 89.3
several 88.2
aircraft 87.9
home 87.7

Imagga
created on 2019-06-04

dairy 58.1
building 39.9
structure 31.6
architecture 25.3
construction 23.1
house 22.6
factory 22.3
industry 22.2
industrial 20
sky 17.9
urban 17.5
travel 16.2
steel 16.1
inside 14.7
center 14.5
interior 14.2
window 13.7
transportation 13.4
modern 13.3
city 13.3
empty 12.9
home 12.8
warehouse 12.6
wood 12.5
power 11.8
concrete 11.5
station 11.4
greenhouse 11.4
business 10.9
manufacturing 10.7
water 10.7
new 10.5
metal 10.5
plant 10
machine 9.9
old 9.8
wall 9.7
roof 9.7
chair 9.6
work 9.5
transport 9.1
equipment 9.1
sand 9
vacation 9
area 8.9
residential 8.6
engineering 8.6
estate 8.5
restaurant 8.1
light 8
glass 7.8
pipe 7.8
storage 7.6
weather 7.6
cafeteria 7.6
cart 7.5
frame 7.5
floor 7.4
landscape 7.4
tourism 7.4
reflection 7.3
furniture 7.2
sea 7
wheeled vehicle 7

Google
created on 2019-06-04

Building 76.9
Room 71.4
Factory 70.5
Machine 65
Toolroom 58.2
Furniture 53.3
Art 50.2

Microsoft
created on 2019-06-04

person 92.9
indoor 92.7
factory 69.6
clothing 54.5
old 54.1
furniture 53.5

Face analysis

Amazon

AWS Rekognition

Age 20-38
Gender Female, 50%
Sad 49.6%
Happy 49.5%
Disgusted 50%
Calm 49.5%
Angry 49.8%
Confused 49.5%
Surprised 49.5%

AWS Rekognition

Age 26-43
Gender Male, 50.2%
Sad 49.9%
Happy 49.7%
Calm 49.7%
Surprised 49.6%
Angry 49.6%
Disgusted 49.5%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Male, 50.2%
Confused 49.6%
Angry 49.6%
Disgusted 49.6%
Surprised 49.6%
Sad 49.6%
Happy 49.9%
Calm 49.6%

AWS Rekognition

Age 38-57
Gender Male, 50.5%
Angry 49.6%
Happy 49.5%
Surprised 49.6%
Calm 49.7%
Confused 49.5%
Sad 49.6%
Disgusted 50%

Feature analysis

Amazon

Person 99.6%

Text analysis

Google

87