Human Generated Data

Title

Frank Residence, Pittsburgh, Pennsylvania, 1939-1940

Date

c. 1939-1940

People

Artist: Unidentified Artist

Artist: Walter Gropius, German, 1883-1969

Classification

Archival Material

Credit Line

Harvard Art Museums/Busch-Reisinger Museum, Gift of Ise Gropius, BRGA.90.283
Machine Generated Data

Tags

Amazon
created on 2019-04-17

Wood 100
Person 99.3
Human 99.3
Person 97.9
Plywood 84.4
Person 81.9
Lumber 81.1
Construction 70.7
Building 64.7
Machine 57.4
Nature 56.6

Clarifai
created on 2019-04-17

Imagga
created on 2019-04-17

chairlift 60.8
factory 55.7
ski tow 50
conveyance 40.6
industry 38.4
structure 33
plant 32.7
industrial 32.7
building 27.3
power 26
steel 25.3
construction 23.1
station 21.3
energy 20.2
metal 19.3
sky 18.5
pipe 17.8
building complex 17.7
engineering 17.1
heavy 16.2
machine 15.8
concrete 15.3
work 14.9
urban 14.9
technology 14.1
architecture 14.1
iron 14
oil 13.9
equipment 13.7
fuel 13.5
production 12.6
pollution 12.5
high 12.1
modern 11.9
transportation 11.7
city 11.6
gas 11.6
water 11.3
electricity 11.3
travel 11.3
transport 11
business 10.9
pump 10.8
machinery 10.7
environment 10.7
tube 10.6
system 10.5
old 10.5
electric 10.3
ship 10
wood 10
piping 9.9
tower 9.8
new 9.7
train 9.6
boat 9.3
house 9.2
outdoors 9
refinery 8.9
pipeline 8.9
cables 8.8
device 8.8
waste 8.7
complex 8.7
part 8.7
environmental 8.5
wire 8.2
marina 8.1
tank 8
snow 8
pipes 7.9
valve 7.9
manufacturing 7.8
track 7.8
engineer 7.8
supply 7.7
winter 7.7
outdoor 7.6
heat 7.4
exterior 7.4
global 7.3
lines 7.2
activity 7.2
vehicle 7

Google
created on 2019-04-17

Lumber 72.6
Vehicle 64.4
Wood 59.6

Microsoft
created on 2019-04-17

factory 95.3
outdoor 90
old 68.4
black and white 65.3
wooden 65.3
construction 43.1
snow 27.6
winter 23.7

Face analysis

AWS Rekognition

Age 20-38
Gender Female, 54.4%
Angry 49.4%
Calm 45.3%
Surprised 45.4%
Disgusted 45.8%
Confused 45.3%
Sad 47.9%
Happy 45.8%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Likely

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

an old photo of a factory 89.5%
old photo of a factory 85.9%
an old photo of a building 83%