Human Generated Data

Title

Untitled (Genest's Bread employees standing next to doughnut machine)

Date

c.1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4012

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Human 98.8
Person 98.8
Person 98
Shop 93.8
Newsstand 81.1
Kiosk 58.3
Furniture 57.1
Bazaar 57
Market 57
Urban 55.5
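Label lists like the one above are the shape of output from the AWS Rekognition `DetectLabels` API, which returns name/confidence pairs for an image. A minimal sketch, assuming boto3 and AWS credentials are configured and the file path is a placeholder; the `top_labels` helper simply sorts and thresholds pairs like those listed:

```python
def detect_labels(image_path, min_confidence=50.0):
    """Call Rekognition DetectLabels on a local image file.

    Assumes boto3 is installed and AWS credentials are configured;
    the image path is a placeholder, not a real asset from this record.
    """
    import boto3
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()}, MinConfidence=min_confidence
        )
    return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

def top_labels(labels, threshold=50.0):
    """Sort (name, confidence) pairs descending and drop low scores."""
    return sorted(
        [(name, conf) for name, conf in labels if conf >= threshold],
        key=lambda pair: -pair[1],
    )

# A subset of the Amazon tags above, transcribed as data:
amazon_tags = [("Human", 98.8), ("Person", 98.8), ("Shop", 93.8),
               ("Newsstand", 81.1), ("Kiosk", 58.3), ("Furniture", 57.1),
               ("Bazaar", 57.0), ("Market", 57.0), ("Urban", 55.5)]
```

Each service in this record (Clarifai, Imagga, Google, Microsoft) returns the same general label/score shape through its own API.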

Clarifai
created on 2019-06-01

furniture 98
room 96.5
people 96.3
inside 93.7
chair 92.4
table 92
indoors 91.6
technology 91.3
modern 89.5
seat 89.2
man 88.3
wedding 86.2
exhibition 84.8
luxury 84.7
group 84.6
woman 84.1
desktop 80.2
industry 78.3
monochrome 77.9
adult 76.9

Imagga
created on 2019-06-01

white goods 100
dishwasher 100
home appliance 81.5
appliance 56.7
durables 27.2
supermarket 22.1
modern 20.3
architecture 20.3
house 18.4
home 18.3
building 16.7
3d 16.3
interior 15.9
glass 15.5
business 14
shopping 13.8
shop 13.7
construction 13.7
design 13.5
grocery store 13.1
technology 12.6
lifestyle 12.3
sketch 11.7
city 11.6
floor 11.2
people 11.1
equipment 11
marketplace 10.8
mercantile establishment 10.8
architect 10.6
drawing 10.5
wall 10.3
room 10
blueprint 9.8
digital 9.7
project 9.6
development 9.5
plan 9.4
industry 9.4
buy 9.4
window 9.2
indoor 9.1
chair 9.1
furniture 8.9
science 8.9
metal 8.8
urban 8.7
man 8.7
light 8.7
residential 8.6
store 8.5
casual 8.5
work 8.5
product 8.3
person 8.3
industrial 8.2
style 8.2
new 8.1
office 8
water 8
basket 7.9
render 7.8
empty 7.7
architectural 7.7
engineering 7.6
perspective 7.5
inside 7.4
idea 7.1
working 7.1
medical 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

furniture 68.3
white 60.4
shop 14.3

Face analysis

Amazon

AWS Rekognition

Age 14-23
Gender Female, 51.4%
Happy 45.5%
Disgusted 45.2%
Angry 45.3%
Calm 50.8%
Surprised 45.5%
Confused 45.6%
Sad 47.2%
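Emotion scores like those above come from the Rekognition `DetectFaces` API called with `Attributes=["ALL"]`: each detected face carries an `Emotions` list of type/confidence entries, and the scores need not sum to 100. A sketch of reducing such a list to the dominant emotion, with the values transcribed from the output above:

```python
def dominant_emotion(emotions):
    """Return the emotion entry with the highest confidence score."""
    return max(emotions, key=lambda e: e["Confidence"])

# Scores transcribed from the Rekognition face analysis above:
face_emotions = [
    {"Type": "HAPPY", "Confidence": 45.5},
    {"Type": "DISGUSTED", "Confidence": 45.2},
    {"Type": "ANGRY", "Confidence": 45.3},
    {"Type": "CALM", "Confidence": 50.8},
    {"Type": "SURPRISED", "Confidence": 45.5},
    {"Type": "CONFUSED", "Confidence": 45.6},
    {"Type": "SAD", "Confidence": 47.2},
]

# For this face, CALM (50.8) is the highest-scoring emotion.
```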

Feature analysis

Amazon

Person 98.8%

Captions

Microsoft

a black and white photo of a store window 80.3%
a group of people standing in front of a window 68.7%
a group of people in front of a window 68.6%