Human Generated Data

Title

Untitled (double exposure of man standing in room against floral wallpaper)

Date

c. 1905

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.3873

Machine Generated Data

Tags

Amazon
created on 2019-06-01

Person 99.4
Human 99.4
Apparel 92.1
Clothing 92.1
Indoors 87.6
Interior Design 87.6
Room 80.1
Door 76.9
Floor 75.7
Home Decor 71.4
Flooring 68.2
People 62.6
Suit 59.5
Overcoat 59.5
Coat 59.5
Corridor 59.3
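The numbers next to each label are confidence percentages, and Rekognition often emits parent/child synonym pairs at the same score (Person/Human, Apparel/Clothing). A minimal Python sketch of thresholding such a list, using the values transcribed from the Amazon output above (the 80% cutoff is an arbitrary illustrative choice, not part of the record):

```python
# (label, confidence) pairs transcribed from the Amazon Rekognition tags above.
tags = [
    ("Person", 99.4), ("Human", 99.4), ("Apparel", 92.1),
    ("Clothing", 92.1), ("Indoors", 87.6), ("Interior Design", 87.6),
    ("Room", 80.1), ("Door", 76.9), ("Floor", 75.7),
    ("Home Decor", 71.4), ("Flooring", 68.2), ("People", 62.6),
    ("Suit", 59.5), ("Overcoat", 59.5), ("Coat", 59.5),
    ("Corridor", 59.3),
]

def confident_tags(pairs, threshold=80.0):
    """Keep only labels at or above the confidence threshold."""
    return [label for label, score in pairs if score >= threshold]

print(confident_tags(tags))
# ['Person', 'Human', 'Apparel', 'Clothing', 'Indoors', 'Interior Design', 'Room']
```

The same filter applies unchanged to the Clarifai, Imagga, and Microsoft lists below, since each is also a flat label/confidence pairing.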

Clarifai
created on 2019-06-01

people 100
adult 99.1
one 99.1
leader 98.7
administration 98.6
wear 94.8
man 94.5
portrait 93.4
furniture 92.9
two 92.8
group 92.6
woman 91.8
home 91.7
room 88.8
chair 84.7
outfit 81.6
indoors 81.5
doorway 80.4
group together 77
seat 75.4

Imagga
created on 2019-06-01

pay-phone 40.1
telephone 35.4
device 28.4
electronic equipment 25
wall 22.2
equipment 21.3
door 19
machine 17.6
old 16.7
cash machine 16.7
elevator 16.3
city 15.8
locker 15.2
building 13.6
people 13.4
architecture 13.3
fastener 13.1
urban 13.1
lifting device 13.1
man 12.8
black 12
street 12
house 11.7
sliding door 11.1
home 10.4
portrait 10.3
adult 10.3
men 10.3
window 10.1
vintage 9.9
call 9.9
restraint 9.9
room 9.8
human 9.7
person 9.6
body 9.6
light 9.4
stone 9.3
construction 8.5
grunge 8.5
male 8.5
travel 8.4
history 8
parking meter 8
sexy 8
interior 8
lifestyle 7.9
hair 7.9
indoors 7.9
box 7.9
ancient 7.8
wood 7.5
outdoors 7.5
timer 7.3
alone 7.3
metal 7.2
life 7.2

Google
created on 2019-06-01

Microsoft
created on 2019-06-01

clothing 97.6
wall 95.8
indoor 94.5
person 94.5
black and white 90.1
white 82.2
man 78.8
standing 77
coat 58.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 23-38
Gender Female, 52.2%
Sad 48.2%
Happy 46.6%
Surprised 45.8%
Calm 47.6%
Disgusted 45.5%
Confused 45.7%
Angry 45.7%
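The emotion scores above cluster closely, so the reported emotion is just the highest-scoring label. A minimal sketch of that argmax, using the Rekognition values transcribed from this record:

```python
# Emotion confidences transcribed from the AWS Rekognition face analysis above.
emotions = {
    "Sad": 48.2, "Happy": 46.6, "Surprised": 45.8, "Calm": 47.6,
    "Disgusted": 45.5, "Confused": 45.7, "Angry": 45.7,
}

# Pick the label with the maximum confidence.
top_emotion = max(emotions, key=emotions.get)
print(top_emotion)  # Sad
```

With scores this close together (48.2% vs. 47.6% for the runner-up), the top label carries little signal on its own, which is worth keeping in mind when reading machine-generated emotion tags.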

Feature analysis

Amazon

Person 99.4%

Categories