Human Generated Data

Title

Untitled (worker digging in excavation site)

Date

c. 1930-1940, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5746

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.7
Person 97.7
Person 97.5
Wood 96.8
Person 95.6
Construction 89.4
Soil 78.9
Nature 72.3
Banister 68.1
Handrail 68.1
Plywood 67.1
Building 59.0
Housing 59.0
Brick 57.1
Outdoors 57.0
Carpenter 56.3
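The numbers beside each tag are detection confidences on a 0–100 scale. As a minimal sketch of how such a tag list might be post-processed — the pairs below are transcribed from the Amazon list above, and the 90-point threshold is an arbitrary choice for illustration, not part of the record:

```python
# Amazon tags transcribed from the record above, as (label, confidence) pairs.
tags = [
    ("Human", 97.7), ("Person", 97.7), ("Person", 97.5), ("Wood", 96.8),
    ("Person", 95.6), ("Construction", 89.4), ("Soil", 78.9), ("Nature", 72.3),
    ("Banister", 68.1), ("Handrail", 68.1), ("Plywood", 67.1), ("Building", 59.0),
    ("Housing", 59.0), ("Brick", 57.1), ("Outdoors", 57.0), ("Carpenter", 56.3),
]

def confident_labels(tags, threshold=90.0):
    """Return distinct labels whose confidence meets the threshold,
    preserving the order they first appear in."""
    seen, result = set(), []
    for label, score in tags:
        if score >= threshold and label not in seen:
            seen.add(label)
            result.append(label)
    return result

print(confident_labels(tags))  # distinct labels scored at 90 or above
```

Note that repeated labels (three separate "Person" detections here) collapse to one entry; a detection service typically reports one label per detected instance.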

Clarifai
created on 2019-11-16

monochrome 98.2
people 97.1
black and white 95.2
no person 94.7
transportation system 93.4
street 93.4
train 89.6
vehicle 89.6
abandoned 88.8
grinder 87.6
winter 87.1
railway 86.0
industry 85.3
house 84.3
calamity 83.5
home 82.1
two 81.6
building 80.8
family 79.8
storm 79.4

Imagga
created on 2019-11-16

building 28.9
structure 27.8
architecture 26.1
travel 21.8
old 20.9
house 20.1
wheeled vehicle 19.4
chair 18.8
wall 17.7
interior 17.7
city 17.5
vehicle 17.4
empty 15.5
construction 15.4
factory 15.3
stone 15.2
urban 14.9
mobile home 14.7
seat 14.0
track 13.9
metal 13.7
industry 13.7
industrial 13.6
conveyance 13.6
light 13.4
sewage system 12.9
window 12.8
tunnel 12.6
car 12.6
housing 12.6
steel 12.5
tourism 12.4
scene 12.1
floor 12.1
town 12.1
street 12.0
trailer 11.7
stretcher 11.3
modern 11.2
glass 10.9
dirty 10.8
wood 10.8
black 10.8
room 10.6
facility 10.4
passage 10.3
litter 10.0
transportation 9.9
place 9.3
inside 9.2
vintage 9.1
device 8.9
snow 8.9
indoors 8.8
freight car 8.7
plant 8.7
door 8.7
water 8.7
passageway 8.6
space 8.5
grunge 8.5
brick 8.5
exterior 8.3
sky 8.3
furniture 8.0
home 8.0
railroad 7.9
steps 7.8
antique 7.8
ancient 7.8
broken 7.7
train 7.7
culture 7.7
engineering 7.6
traditional 7.5
garage 7.3
history 7.2
work 7.1
barrier 7.0

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 98.1
house 97.1
outdoor 95.7
black and white 92.4
window 75.4
building 58.6

Face analysis

Amazon

AWS Rekognition

Age 24-38
Gender Male, 50.3%
Disgusted 49.5%
Confused 49.5%
Fear 49.5%
Calm 49.5%
Happy 49.5%
Surprised 49.5%
Angry 50.5%
Sad 49.5%

AWS Rekognition

Age 35-51
Gender Male, 50.5%
Calm 49.6%
Happy 49.5%
Angry 49.6%
Disgusted 49.5%
Fear 49.9%
Sad 49.9%
Confused 49.5%
Surprised 49.5%
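Each face record above carries a near-uniform distribution of emotion confidences, and the reported emotion is simply the highest-scoring entry. A minimal sketch, using the second face's scores transcribed from the record above:

```python
# Emotion confidences for the second detected face, transcribed from
# the record above (percentages as reported).
emotions = {
    "Calm": 49.6, "Happy": 49.5, "Angry": 49.6, "Disgusted": 49.5,
    "Fear": 49.9, "Sad": 49.9, "Confused": 49.5, "Surprised": 49.5,
}

# The dominant emotion is the highest-scoring key; when scores tie
# (Fear and Sad both at 49.9 here), max() keeps the first one visited.
dominant = max(emotions, key=emotions.get)
print(dominant)
```

With scores this close to uniform, the "dominant" label carries little signal — a margin of 0.3–1.0 points over the alternatives, which is why such distributions are usually treated as inconclusive rather than as a confident emotion reading.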

Feature analysis

Amazon

Person 97.7%

Categories