Human Generated Data

Title

Untitled (two men working at construction site)

Date

1951-1957

People

Artist: Orrion Barger, American, active 1913–1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6418

Machine Generated Data

Tags

Amazon
created on 2019-03-22

Human 96
Person 94.8
Construction 60.2
Wood 58.6
Carpenter 57.5
People 56.6
Building 55.5
Architecture 55.5
Tower 55.5
Clock Tower 55.5
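Each machine-generated tag above pairs a label with a confidence score, and a downstream consumer typically keeps only labels above some threshold. A minimal sketch, using the Amazon tag values listed above (the threshold of 90 is an arbitrary illustration, not part of the record):

```python
# (label, confidence) pairs copied from the Amazon tag list above.
amazon_tags = [
    ("Human", 96.0), ("Person", 94.8), ("Construction", 60.2),
    ("Wood", 58.6), ("Carpenter", 57.5), ("People", 56.6),
    ("Building", 55.5), ("Architecture", 55.5), ("Tower", 55.5),
    ("Clock Tower", 55.5),
]

def confident_labels(tags, threshold=90.0):
    """Return labels whose confidence meets the threshold, highest first."""
    return [label for label, score in sorted(tags, key=lambda t: -t[1])
            if score >= threshold]

print(confident_labels(amazon_tags))  # ['Human', 'Person']
```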

Clarifai
created on 2019-03-22

people 99.4
adult 97.6
industry 95.5
vehicle 94.5
grinder 94.1
one 93.9
man 93.1
transportation system 89.9
production 89.1
watercraft 88
group 85.9
two 85.8
wear 84.2
military 83.3
woman 81.8
science 81.7
war 81.7
scientist 81.5
print 81.4
exploration 81.2

Imagga
created on 2019-03-22

industry 34.2
equipment 29.4
machine 28.8
industrial 28.1
construction 26.5
sky 22.3
crane 19.2
power 18.5
building 18.3
heavy 18.1
transportation 17.9
device 17.2
ship 16.5
work 15.8
steel 15.8
boat 15.7
transport 15.5
vehicle 15.3
vessel 14.8
structure 14.5
harbor 14.4
sea 14.1
business 14
port 13.5
water 13.3
architecture 13.3
environment 13.2
metal 12.9
machinery 12.7
energy 12.6
old 11.8
factory 11.8
fisherman 11.7
urban 11.4
outdoor 10.7
gas 10.6
seller 10.6
engineering 10.5
technology 10.4
site 10.3
gear 10.1
city 10
ocean 10
container 9.9
dock 9.7
rigging 9.7
outdoors 9.7
fuel 9.6
truck 9.6
silhouette 9.1
pipe 8.8
cargo 8.7
pollution 8.7
tower 8.1
shipping 7.9
chair 7.9
destruction 7.8
black 7.8
cloud 7.8
travel 7.7
outside 7.7
house 7.5
landscape 7.4
oil 7.4
tourism 7.4
global 7.3
dirty 7.2
sunset 7.2
pump 7

Google
created on 2019-03-22

Microsoft
created on 2019-03-22

outdoor 92
black 77.3
white 64.6
old 62.4
black and white 62.4
construction 30.9
monochrome 12.6

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Sad 50%
Disgusted 49.5%
Angry 49.5%
Calm 49.6%
Happy 49.8%
Confused 49.5%
Surprised 49.5%
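The emotion values above are per-class confidences for the detected face; a consumer of this record would usually report only the highest-scoring emotion. A minimal sketch, using the values listed (the variable names are illustrative, not from any API):

```python
# Emotion confidences copied from the AWS Rekognition face analysis above.
emotions = {
    "Sad": 50.0, "Disgusted": 49.5, "Angry": 49.5, "Calm": 49.6,
    "Happy": 49.8, "Confused": 49.5, "Surprised": 49.5,
}

# Report the emotion with the highest confidence.
dominant = max(emotions, key=emotions.get)
print(dominant)  # Sad
```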

Feature analysis

Amazon

Person 94.8%
Clock Tower 55.5%

Categories