Human Generated Data

Title

Untitled (Genest's Bread bread company employees on front step of building)

Date

c.1937

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.4004

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.6
Human 99.6
Person 99.4
Person 99.4
Person 99.3
Person 99.3
Person 99.2
Person 99.2
Person 99
Person 98.5
Person 97.8
Person 97.5
Person 94.6
Person 93.1
Person 90.3
Floor 89.8
Person 89.1
Flooring 80.4
People 78.7
Person 76.9
Person 74.5
Clothing 64.9
Apparel 64.9
Crowd 60.8
Indoors 59.8
Room 56.4
Person 43.3
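
The Amazon list above has the shape of output from Amazon Rekognition's DetectLabels operation (label name plus a confidence score on a 0-100 scale). As an illustration only, below is a minimal boto3 sketch that would produce comparable tags; the file name photo.jpg and the MinConfidence value are assumptions, since the original tagging pipeline and image source are not documented in this record.

```python
import boto3

# Assumes AWS credentials are already configured in the environment.
rekognition = boto3.client("rekognition")

# "photo.jpg" is a placeholder for the digitized photograph.
with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# DetectLabels returns Name/Confidence pairs like the list above
# (e.g. "Person 99.6"); the threshold here is illustrative.
response = rekognition.detect_labels(
    Image={"Bytes": image_bytes},
    MinConfidence=40,
)

for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")
```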

Clarifai
created on 2019-06-01

room 97.2
people 96.1
indoors 94
woman 89.7
inside 88.2
adult 86.9
rack 86.3
monochrome 85.1
many 85.1
cabinet 84.8
technology 84.6
business 83.8
man 81.5
modern 81
family 80.4
desktop 78.7
order 77.6
server 77.4
row 76.7
furniture 76.2

Imagga
created on 2019-06-01

picket fence 23.8
structure 22
fence 21.7
art 19.2
fountain 15.9
barrier 15.5
flag 15.2
architecture 15
negative 14.8
building 14.8
design 14.7
city 14.1
business 12.7
house 12.6
graphic 12.4
scene 11.2
old 11.1
grunge 11.1
film 10.7
retro 10.6
sign 10.5
urban 10.5
water 10
obstruction 9.7
people 9.5
silhouette 9.1
decoration 9
marble 8.9
sky 8.9
cool 8.9
symbol 8.7
crowd 8.6
construction 8.6
winter 8.5
sculpture 8.4
modern 8.4
clip 8.4
office 8.3
snow 8.3
vintage 8.3
outdoors 8.2
ice 8.2
style 8.2
gymnasium 8.1
group 8.1
life 7.9
waving 7.8
patriotism 7.7
glass 7.7
ripple 7.6
net 7.5
exterior 7.4
light 7.3
speed 7.3
national 7.2
black 7.2
antique 7.1
information 7.1
day 7.1
travel 7
country 7

Google
created on 2019-06-01

Photograph 96.6
Text 85.2
Snapshot 82.5
Room 65.7
Photography 62.4
Black-and-white 56.4
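
The Google tags follow the same label/confidence pattern. A minimal sketch using the google-cloud-vision Python client is shown below, again with photo.jpg as a placeholder; the client reports scores on a 0-1 scale, scaled here to percentages for comparison with the figures above.

```python
from google.cloud import vision

# Assumes GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
client = vision.ImageAnnotatorClient()

# "photo.jpg" is a placeholder for the digitized photograph.
with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# label_detection returns LabelAnnotation entries with description and score.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description} {label.score * 100:.1f}")
```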

Microsoft
created on 2019-06-01

person 92.2
clothing 83.7
white 61.7
black and white 52.3

Face analysis

Amazon

AWS Rekognition

Age 16-27
Gender Female, 54%
Sad 45.8%
Angry 46.4%
Disgusted 50.9%
Surprised 45.7%
Happy 45.9%
Calm 45.1%
Confused 45.3%

AWS Rekognition

Age 35-55
Gender Female, 52.3%
Angry 45.8%
Disgusted 46.3%
Confused 45.5%
Calm 46.6%
Sad 49.3%
Surprised 45.7%
Happy 45.8%

AWS Rekognition

Age 27-44
Gender Female, 53.7%
Surprised 46%
Confused 45.5%
Disgusted 47.4%
Happy 47.9%
Sad 46.6%
Calm 45.2%
Angry 46.4%

AWS Rekognition

Age 10-15
Gender Male, 51.7%
Confused 46%
Surprised 45.3%
Happy 45.3%
Calm 47.2%
Sad 49.2%
Disgusted 46%
Angry 46%

AWS Rekognition

Age 20-38
Gender Female, 52.8%
Surprised 46%
Sad 48.2%
Angry 46.4%
Disgusted 47.3%
Calm 45.3%
Happy 46.4%
Confused 45.5%

AWS Rekognition

Age 35-53
Gender Male, 50.8%
Angry 46%
Calm 47.5%
Sad 46.2%
Surprised 45.7%
Disgusted 47.7%
Happy 46.4%
Confused 45.5%

AWS Rekognition

Age 26-44
Gender Female, 54.7%
Surprised 45.8%
Sad 48.5%
Happy 46.5%
Angry 46%
Disgusted 46.9%
Confused 45.4%
Calm 45.8%

AWS Rekognition

Age 23-38
Gender Male, 53.8%
Disgusted 48%
Calm 45.7%
Sad 46.4%
Confused 45.6%
Angry 45.8%
Surprised 45.8%
Happy 47.8%

AWS Rekognition

Age 27-44
Gender Female, 54.5%
Confused 45.3%
Disgusted 50.6%
Happy 45.6%
Surprised 45.7%
Calm 45.2%
Sad 46%
Angry 46.5%

AWS Rekognition

Age 19-36
Gender Female, 53.9%
Happy 45.9%
Sad 46.1%
Surprised 46%
Confused 45.5%
Disgusted 48.6%
Calm 45.6%
Angry 47.3%

AWS Rekognition

Age 26-43
Gender Female, 53.1%
Angry 46%
Happy 47.9%
Sad 45.7%
Disgusted 47%
Confused 45.4%
Calm 47.2%
Surprised 45.8%

AWS Rekognition

Age 48-68
Gender Female, 50.5%
Angry 45.2%
Disgusted 45%
Confused 45.1%
Calm 45.1%
Sad 54.6%
Surprised 45%
Happy 45%
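
Each face block above mirrors the fields returned by Amazon Rekognition's DetectFaces operation when all attributes are requested: an estimated age range (AgeRange), a gender estimate with confidence (Gender), and per-emotion confidences (Emotions). A minimal boto3 sketch, with photo.jpg again standing in for the digitized print:

```python
import boto3

rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    image_bytes = f.read()

# Attributes=["ALL"] requests age range, gender, and emotion estimates
# in addition to the default bounding box and pose data.
response = rekognition.detect_faces(
    Image={"Bytes": image_bytes},
    Attributes=["ALL"],
)

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    gender = face["Gender"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {gender['Value']}, {gender['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")
```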

Feature analysis

Amazon

Person 99.6%
