Human Generated Data

Title

Untitled (bakery employees posing with donuts and donut machinery)

Date

1938

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4296

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-06-01

Person 99.3
Human 99.3
Person 98.2
Shop 92.9
Indoors 91.9
Interior Design 91.9
Apparel 79.4
Clothing 79.4
Text 65.7
Person 64.9
People 64.1
Newsstand 62.4
Bazaar 58.4
Market 58.4
Window Display 57.9
Overcoat 57.6
Coat 57.6
Suit 57.6
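
Tag lists like the Amazon block above are the kind of output returned by an image-labeling API such as AWS Rekognition's DetectLabels. A minimal sketch of turning such a response into "Name confidence" lines; the sample payload and the 55% threshold are illustrative assumptions, not the museum's actual API call:

```python
# Sketch: format a Rekognition DetectLabels-style response into
# "Name confidence" lines like the tag list above.

def format_labels(response, min_confidence=55.0):
    """Return 'Name confidence' lines, sorted by descending confidence."""
    labels = [
        (lbl["Name"], lbl["Confidence"])
        for lbl in response.get("Labels", [])
        if lbl["Confidence"] >= min_confidence
    ]
    labels.sort(key=lambda pair: -pair[1])
    return [f"{name} {conf:.1f}" for name, conf in labels]

# Illustrative response, echoing a few of the tags above.
sample = {
    "Labels": [
        {"Name": "Person", "Confidence": 99.3},
        {"Name": "Shop", "Confidence": 92.9},
        {"Name": "Text", "Confidence": 65.7},
        {"Name": "Tree", "Confidence": 12.0},  # below threshold, dropped
    ]
}
print(format_labels(sample))  # ['Person 99.3', 'Shop 92.9', 'Text 65.7']
```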

Clarifai
created on 2019-06-01

people 99.9
adult 99.2
group together 98.1
group 97.4
man 97
two 96.4
room 95.9
monochrome 95.3
furniture 95.1
many 93.9
indoors 92.7
woman 92.4
one 92.1
street 90.5
vehicle 89.8
airport 89.1
three 86.1
grinder 85.8
transportation system 85.7
several 84.9

Imagga
created on 2019-06-01

shopping cart 36.3
counter 29.5
handcart 28.3
structure 26.2
architecture 25.3
balcony 21.8
wheeled vehicle 21.6
water 20.7
modern 20.3
building 19.8
urban 18.3
container 18.1
house 17.5
city 17.5
chair 16.4
sky 15.9
window 15.8
construction 15.4
steel 15
interior 14.1
travel 13.4
furniture 13.3
wall 12.8
seat 12.4
business 12.1
stall 12.1
rocking chair 12.1
glass 11.7
transportation 11.6
river 11.6
transport 11
sea 10.9
industrial 10.9
office 10.6
reflection 10.6
boat 10.5
home 10.5
bridge 10.4
light 10
metal 9.7
industry 9.4
equipment 9.3
floor 9.3
ship 9
tower 8.9
door 8.5
gate 8.4
silhouette 8.3
indoor 8.2
plant 8.2
technology 8.2
airport 7.8
black 7.8
people 7.8
high 7.8
roof 7.6
ocean 7.5
tourism 7.4
exterior 7.4
street 7.4
conveyance 7.3
design 7.3
vessel 7.2
station 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

window 99.1
person 88.4
black and white 85.9
clothing 84.7
man 73.4
building 60

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 54.8%
Disgusted 45.1%
Sad 47.3%
Happy 45.2%
Confused 45.1%
Surprised 45.1%
Angry 45.2%
Calm 52%

AWS Rekognition

Age 26-43
Gender Female, 51.2%
Disgusted 46.8%
Sad 48.8%
Surprised 45.6%
Happy 46.1%
Angry 46.2%
Calm 45.7%
Confused 45.7%
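
The per-face entries above follow the shape of AWS Rekognition's DetectFaces output, which reports an age range, a gender guess, and an unsorted list of emotion scores per face. A minimal sketch (the sample detail is illustrative, not the actual payload) of reducing one face record to the summary lines shown:

```python
# Sketch: summarize a Rekognition DetectFaces-style FaceDetail entry
# into the age-range / gender / dominant-emotion lines above.

def summarize_face(detail):
    age = detail["AgeRange"]
    gender = detail["Gender"]
    # Emotion scores come unsorted; pick the highest-confidence one.
    top = max(detail["Emotions"], key=lambda e: e["Confidence"])
    return {
        "age": f"{age['Low']}-{age['High']}",
        "gender": f"{gender['Value']}, {gender['Confidence']:.1f}%",
        "top_emotion": f"{top['Type'].capitalize()} {top['Confidence']:.1f}%",
    }

# Illustrative detail, echoing the first face record above.
sample = {
    "AgeRange": {"Low": 26, "High": 43},
    "Gender": {"Value": "Male", "Confidence": 54.8},
    "Emotions": [
        {"Type": "SAD", "Confidence": 47.3},
        {"Type": "CALM", "Confidence": 52.0},
        {"Type": "HAPPY", "Confidence": 45.2},
    ],
}
print(summarize_face(sample))
```

Note how close the emotion scores are to one another (45-52%): on a posed 1938 group portrait, the classifier is effectively uncertain, and "Calm" wins only narrowly.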

Feature analysis

Amazon

Person 99.3%

Captions

Microsoft

a person standing in front of a window 82.3%
a person standing next to a window 76.4%
a person standing next to a window 59.9%

Text analysis

Amazon

0000
oop000o
aoooodcoo
0000ooo
00000001
o608e00
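
Strings like those above are what OCR (e.g. AWS Rekognition's DetectText) returns when run on signage in a low-contrast 1938 photograph: mostly garbled digit-and-letter runs. A minimal sketch of pulling detected strings out of such a response; the sample payload is an assumption for illustration:

```python
# Sketch: extract line-level strings from a Rekognition DetectText-style
# response, as in the "Text analysis" block above. Noisy OCR on archival
# photographs often yields garbled strings like "oop000o".

def detected_lines(response):
    """Return the text of each LINE-type detection, in order."""
    return [
        det["DetectedText"]
        for det in response.get("TextDetections", [])
        if det["Type"] == "LINE"
    ]

# Illustrative response, echoing the detections above.
sample = {
    "TextDetections": [
        {"DetectedText": "0000", "Type": "LINE"},
        {"DetectedText": "oop000o", "Type": "LINE"},
        {"DetectedText": "0000", "Type": "WORD"},  # word-level duplicate, skipped
    ]
}
print(detected_lines(sample))  # ['0000', 'oop000o']
```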