Human Generated Data

Title

Untitled (Genest's Bread employee standing with rack of bread next to ovens)

Date

c.1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4009
Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 98.7
Human 98.7
Building 80.5
Machine 69.9
Clothing 69.7
Shoe 69.7
Apparel 69.7
Footwear 69.7
Crowd 65.7
Factory 57.4

Clarifai
created on 2019-06-01

people 98.9
room 96.8
adult 96.6
group 95.9
door 93
group together 92.9
indoors 91.5
bedrock 90.7
wear 90.1
man 89.5
industry 89.2
vehicle 89.1
one 88.2
business 87.4
transportation system 86.4
many 84.5
light 83.5
technology 83.4
monochrome 83.1
server 82.8

Imagga
created on 2019-06-01

stall 32.6
architecture 29.4
gate 26.9
turnstile 26.8
interior 24.8
building 22.9
structure 20.9
construction 20.5
urban 18.3
movable barrier 18
city 17.5
office 16.9
modern 16.1
house 15.9
business 15.8
wall 15.4
boutique 15.2
station 14.5
sketch 14.1
hall 14
window 14
floor 13.9
barrier 13.6
new 12.9
metal 12.9
industry 12.8
steel 12.4
design 11.8
industrial 11.8
inside 11
glass 10.9
light 10.7
sky 10.2
transportation 9.9
project 9.6
concrete 9.6
room 9.4
3d 9.3
door 9.1
drawing 8.6
empty 8.6
roof 8.6
space 8.5
build 8.5
perspective 8.5
exterior 8.3
indoor 8.2
technology 8.2
equipment 8.1
home 8
work 7.8
people 7.8
residence 7.8
entrance 7.7
motion 7.7
architectural 7.7
development 7.6
warehouse 7.6
plan 7.6
power 7.6
place 7.4
open 7.2
travel 7
indoors 7

Google
created on 2019-06-01

Room 65.7
Architecture 65.5
Door 58.3
Black-and-white 56.4
Building 53.4

Microsoft
created on 2019-06-01

building 99.5
outdoor 97.1
black and white 74.6
door 73

Face analysis

Amazon

AWS Rekognition

Age 45-66
Gender Female, 54.1%
Surprised 45.5%
Disgusted 45.3%
Calm 50.2%
Happy 45.7%
Sad 47.1%
Angry 45.4%
Confused 45.9%

Feature analysis

Amazon

Person 98.7%
Shoe 69.7%

Captions

Microsoft

a person standing in front of a building 72.4%
a group of people standing in front of a building 71.3%
a person standing in front of a building 70.4%