Human Generated Data

Title

Races, Jews: United States. New Jersey. Woodbine. Baron de Hirsch Agricultural and Industrial School: Woodbine Settlement and School, Woodbine, N.J. Baron de Hirsch Fund.: 55. Interior of Machine Shop.

Date

c. 1904

People

Artist: Unidentified Artist

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, Social Museum Collection, 3.2002.3549.2

Machine Generated Data

Tags

Amazon
created on 2019-06-05

Assembly Line 100
Building 100
Factory 100
Human 99.6
Person 99.6
Person 99.5
Person 99.4
Person 98.5
Person 97.4
Person 97.3
Person 96.9
Person 90
Person 86.2
Painting 69.8
Art 69.8

Clarifai
created on 2019-06-05

people 100
group together 99.3
many 99.3
vehicle 99.3
transportation system 99.2
group 99.2
adult 97.3
train 96.6
man 95.9
military 95.6
aircraft 95.5
railway 95
crowd 93.1
war 89.2
wear 88.6
watercraft 88.4
airplane 88.3
street 86.8
indoors 86.7
several 83.8

Imagga
created on 2019-06-05

building 36.2
urban 34.1
industrial 26.3
station 25.2
sketch 24.3
industry 23.9
supermarket 23.7
city 21.6
architecture 21.6
modern 20.3
drawing 18.9
steel 18.7
interior 18.6
grocery store 18.5
factory 18.3
transportation 17.9
construction 17.1
shopping cart 16.6
power 15.9
business 15.8
structure 15.7
metal 15.3
inside 14.7
hall 14.6
handcart 14.3
engineering 14.3
wheeled vehicle 13.9
marketplace 13.8
reflection 13.8
window 13.7
crowd 13.4
representation 13.3
floor 13
airport 12.8
transport 12.8
travel 12.7
technology 12.6
people 12.3
plant 11.9
train 11.8
pipe 11.7
tube 11.6
perspective 11.3
scene 11.2
wall 11.2
glass 10.9
metro 10.8
subway 10.8
corridor 10.8
work 10.4
pipes 9.8
mercantile establishment 9.7
office 9.6
heavy 9.5
high 9.5
empty 9.4
men 9.4
old 9.1
lines 9
passenger 8.9
ceiling 8.8
line 8.8
manufacturing 8.8
water 8.7
pollution 8.6
move 8.6
concrete 8.6
motion 8.6
container 8.6
gate 8.5
sky 8.3
speed 8.2
new 8.1
futuristic 8.1
man 8.1
light 8
piping 7.9
equipment 7.9
railway 7.8
wire 7.8
stairs 7.8
mechanical 7.8
waste 7.8
steam 7.8
supply 7.7
facility 7.7
gas 7.7
roof 7.6
estate 7.6
walking 7.6
silhouette 7.4
life 7.3
indoor 7.3
tunnel 7.1
indoors 7

Google
created on 2019-06-05

Microsoft
created on 2019-06-05

clothing 92.8
person 90.4
man 54.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 15-25
Gender Male, 50.5%
Happy 45.4%
Confused 45.2%
Disgusted 47.5%
Sad 46.7%
Calm 47%
Angry 47.9%
Surprised 45.2%

AWS Rekognition

Age 26-43
Gender Female, 51.2%
Sad 46.3%
Happy 45.2%
Angry 45.6%
Calm 51.4%
Disgusted 45.3%
Confused 45.5%
Surprised 45.7%

AWS Rekognition

Age 26-43
Gender Male, 50.5%
Sad 49.8%
Disgusted 49.6%
Confused 49.5%
Happy 49.5%
Calm 49.7%
Surprised 49.5%
Angry 49.8%

AWS Rekognition

Age 49-69
Gender Female, 52.6%
Angry 45.5%
Surprised 45.3%
Sad 46.5%
Disgusted 45.2%
Confused 45.3%
Happy 45.3%
Calm 52.1%

AWS Rekognition

Age 60-80
Gender Male, 50.2%
Calm 49.5%
Disgusted 49.5%
Confused 49.5%
Angry 49.5%
Sad 50.4%
Surprised 49.5%
Happy 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Happy 49.6%
Disgusted 49.6%
Confused 49.6%
Surprised 49.6%
Sad 50%
Angry 49.6%
Calm 49.7%

AWS Rekognition

Age 15-25
Gender Female, 50.3%
Sad 50.4%
Disgusted 49.5%
Happy 49.5%
Angry 49.6%
Surprised 49.5%
Calm 49.5%
Confused 49.5%

AWS Rekognition

Age 20-38
Gender Female, 50.5%
Sad 49.6%
Happy 50.2%
Angry 49.5%
Calm 49.6%
Disgusted 49.5%
Confused 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 99.6%
Painting 69.8%