Human Generated Data

Title

Untitled (man sitting on bench on porch)

Date

c. 1935, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.6052

Machine Generated Data

Tags (confidence, %)

Amazon
created on 2019-05-30

Indoors 99.9
Interior Design 99.9
Room 92.9
Advertisement 88.7
Poster 88.7
Human 86.2
Person 86.2
Banister 85.4
Handrail 85.4
Electronics 84.6
Screen 84.6
Projection Screen 74.8
Theater 60.8
Display 59.8
Monitor 59.8
Architecture 58.7
Tower 58.7
Spire 58.7
Steeple 58.7
Building 58.7
Classroom 58.2
School 58.2
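Each service pairs a label with a confidence score, and low-scoring labels (e.g. "Steeple 58.7") are clearly noise for this photograph. A minimal sketch, not any provider's actual API, of filtering such output by a confidence threshold, using a few values copied from the Amazon list above:

```python
# Hypothetical post-processing of machine-generated labels: keep only
# labels at or above a confidence threshold, highest-scoring first.
# The label/score pairs below are copied from the Amazon list above.
labels = {
    "Indoors": 99.9,
    "Interior Design": 99.9,
    "Room": 92.9,
    "Poster": 88.7,
    "Person": 86.2,
    "Theater": 60.8,
    "Steeple": 58.7,
}

def confident_labels(scores, threshold=85.0):
    """Return label names at or above the threshold, sorted by score descending."""
    kept = [(name, score) for name, score in scores.items() if score >= threshold]
    return [name for name, _ in sorted(kept, key=lambda pair: -pair[1])]

print(confident_labels(labels))
# → ['Indoors', 'Interior Design', 'Room', 'Poster', 'Person']
```

An 85% cutoff here discards the implausible architectural labels (Theater, Steeple) while keeping the labels consistent with the photograph's subject.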

Clarifai
created on 2019-05-30

monochrome 98
people 96.2
window 92.6
no person 92.3
architecture 89.8
street 88
man 88
desktop 85.9
city 85.5
building 85.5
outdoors 84.3
illustration 83.8
adult 81.8
vehicle 81.2
business 80.9
art 80.7
black and white 80.1
travel 79.4
indoors 78.9
dark 78.8

Imagga
created on 2019-05-30

house 24.2
architecture 21.2
building 20.4
home 18.3
equipment 17.8
window 17.2
device 16.6
structure 15.6
electronic equipment 14.4
urban 14
city 13.3
design 12.4
monitor 11.8
black 11.4
old 11.1
construction 11.1
sky 10.2
3d 10.1
history 9.8
modern 9.8
business 9.7
metal 9.7
style 9.6
grunge 9.4
finance 9.3
silhouette 9.1
drawing 8.8
symbol 8.7
light 8.7
sketch 8.4
film 8.4
exterior 8.3
vintage 8.3
facade 8.2
technology 8.2
wall 7.7
home appliance 7.4
retro 7.4
street 7.4
banking 7.4
art 7.3
door 7.3
interior 7.1
steel 7.1
reflection 7

Google
created on 2019-05-30

Microsoft
created on 2019-05-30

monitor 93.4
black and white 90.3
screen 74
image 30.4

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 26-43
Gender Male, 50.4%
Angry 49.5%
Calm 50.3%
Confused 49.5%
Disgusted 49.5%
Happy 49.5%
Sad 49.5%
Surprised 49.6%
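The emotion scores above are near-uniform, so the usual way to read such output is to take the highest-confidence emotion (here, Calm at 50.3%). A minimal sketch of that selection, with the values copied from the AWS Rekognition list above:

```python
# Hypothetical reading of per-emotion confidence scores: report the
# single most likely emotion. Values copied from the list above.
emotions = {
    "Angry": 49.5,
    "Calm": 50.3,
    "Confused": 49.5,
    "Disgusted": 49.5,
    "Happy": 49.5,
    "Sad": 49.5,
    "Surprised": 49.6,
}

def dominant_emotion(scores):
    """Return the emotion with the highest confidence score."""
    return max(scores, key=scores.get)

print(dominant_emotion(emotions))
# → Calm
```

Note how close the scores are: a margin of under one percentage point separates "Calm" from the rest, so the dominant label should be treated as weak evidence rather than a firm classification.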

Feature analysis

Amazon

Poster 88.7%
Person 86.2%

Categories

Imagga

paintings art 92.5%
food drinks 4.4%
interior objects 2.2%