Human Generated Data

Title

Untitled (group of men in classroom with tower of arranged chairs)

Date

c. 1907

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3865

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 97.3
Human 97.3
Lamp 93.3
Person 87.4
Chandelier 86.5
Person 79.3
Lighting 77.9
Indoors 65.7
Floor 60
Screen 55.5
Electronics 55.5
Room 55.5

Clarifai
created on 2019-06-01

indoors 99
furniture 97.7
people 97.3
room 97.3
inside 95.4
chair 94.3
empty 94.1
light 93.9
window 92.5
no person 89.7
family 89.4
adult 88.6
seat 88.4
architecture 87.4
wood 86.9
group 84.9
contemporary 82.1
house 81.2
man 81
woman 79.3

Imagga
created on 2019-06-01

architecture 34.4
sketch 34
hall 30.7
drawing 29.9
house 29.3
interior 23
building 21.6
home 21.2
modern 21
snow 20.4
window 20.3
design 19.7
room 18.2
structure 17.9
representation 17.8
furniture 17.8
construction 17.1
floor 16.7
city 16.6
decor 15.9
urban 14.9
luxury 14.6
winter 14.5
residential 13.4
travel 13.4
lamp 13.3
table 13
light 12.7
art 12.7
old 12.5
decoration 12.4
marble 12.3
cold 12
ice 11.9
3d 11.6
sky 11.5
scene 11.2
wall 11.2
day 11
glass 10.9
wood 10.8
comfortable 10.5
style 10.4
business 10.3
exterior 10.1
residence 10.1
reflection 10.1
negative 10.1
water 10
new 9.7
chair 9.6
apartment 9.6
estate 9.5
living 9.5
frame 9.2
tree 9.2
clean 9.2
retro 9
door 8.8
balcony 8.8
life 8.6
frozen 8.6
antique 8.6
nobody 8.5
space 8.5
weather 8.3
film 8.3
pattern 8.2
landscape 8.2
office 8.1
indoors 7.9
gymnasium 7.8
built 7.7
architectural 7.7
grunge 7.7
tracing 7.7
real 7.6
lifestyles 7.6
vintage 7.4
classic 7.4
inside 7.4
graphic 7.3
landmark 7.2
mountain 7.1

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

black and white 90.9
furniture 68.8
black 68.5
table 62.2
white 61.4
old 42.3

Face analysis

Amazon

AWS Rekognition

Age 35-52
Gender Female, 50.5%
Disgusted 49.6%
Calm 49.6%
Confused 49.6%
Sad 50%
Surprised 49.6%
Angry 49.6%
Happy 49.5%

AWS Rekognition

Age 15-25
Gender Female, 50.2%
Angry 49.6%
Calm 50.2%
Sad 49.6%
Surprised 49.5%
Disgusted 49.5%
Happy 49.5%
Confused 49.6%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Confused 49.6%
Surprised 49.5%
Calm 49.6%
Sad 49.8%
Happy 49.6%
Disgusted 49.6%
Angry 49.7%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Sad 50%
Happy 49.5%
Confused 49.5%
Angry 49.6%
Surprised 49.5%
Disgusted 49.5%
Calm 49.8%

AWS Rekognition

Age 35-52
Gender Female, 50.3%
Disgusted 49.5%
Sad 50.3%
Surprised 49.5%
Happy 49.5%
Angry 49.5%
Calm 49.6%
Confused 49.5%

AWS Rekognition

Age 12-22
Gender Female, 50.3%
Angry 50.1%
Confused 49.6%
Surprised 49.5%
Happy 49.5%
Sad 49.6%
Calm 49.7%
Disgusted 49.5%

AWS Rekognition

Age 26-43
Gender Female, 50.4%
Confused 49.6%
Angry 49.5%
Sad 49.6%
Calm 50.2%
Disgusted 49.5%
Happy 49.5%
Surprised 49.5%

Feature analysis

Amazon

Person 97.3%

Captions

Microsoft

a black and white photo of a man 57.4%
a black and white photo of a store window 57.3%
a black and white photo of a store 57.2%