Human Generated Data

Title

Untitled (two women standing beside doorway in bank with man behind window)

Date

1929, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11183

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 97.3
Person 97.3
Floor 94.3
Flooring 91.2
Person 69.7
Sliding Door 60.9
Indoors 59.9
Art 58.2
Person 57.9
Corridor 55.3
Door 50.6

Clarifai
created on 2019-11-16

indoors 98.6
architecture 97.2
window 96.3
room 96
people 95.8
monochrome 95.4
no person 95.1
street 94.3
house 88.9
shadow 88.5
door 88.4
light 87.9
home 87.4
family 86.9
modern 85.6
inside 85.2
hallway 82.3
furniture 81.8
subway system 81.5
city 80.5

Imagga
created on 2019-11-16

sliding door 56.7
door 53.1
interior 45.1
architecture 36.2
movable barrier 35.6
building 29.4
modern 27.3
barrier 24.6
window 22.3
inside 21.2
room 21.2
office 20.9
urban 20.1
light 20.1
wall 19.9
glass 19.5
furniture 18.9
business 18.8
corridor 18.7
chair 18.6
indoor 18.3
hall 17.8
city 17.5
floor 16.7
indoors 16.7
3d 16.3
house 15.9
structure 15.5
design 15.2
empty 14.6
home 14.4
table 13.1
obstruction 12.6
decor 12.4
metal 12.1
luxury 12
station 11.8
center 11.7
transportation 11.7
entrance 11.6
apartment 11.5
steel 11.5
travel 11.3
shop 11
reflection 10.7
windows 10.6
hotel 10.5
perspective 10.4
people 10
train 10
lamp 10
hallway 9.9
subway 9.9
decoration 9.5
construction 9.4
restaurant 9.2
wood 9.2
open 9
new 8.9
gate 8.9
ceiling 8.8
mall 8.8
public 8.7
boutique 8.6
nobody 8.6
space 8.5
place 8.4
silhouette 8.3
transport 8.2
stylish 8.1
metro 7.9
doors 7.9
render 7.8
residence 7.8
mirror 7.6
kitchen 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

black and white 98.1
text 90.5
indoor 88.8
monochrome 86.8
street 85.5
white 82.9
building 81
house 72.8
black 69.4
open 63.4
window 58.2

Face analysis

Amazon

AWS Rekognition

Age 22-34
Gender Female, 50.4%
Disgusted 49.5%
Confused 49.5%
Fear 50.1%
Calm 49.5%
Happy 49.6%
Surprised 49.6%
Angry 49.6%
Sad 49.5%

AWS Rekognition

Age 16-28
Gender Male, 50%
Fear 49.9%
Sad 49.5%
Happy 49.5%
Disgusted 49.5%
Calm 49.6%
Confused 49.5%
Angry 49.6%
Surprised 49.9%

Feature analysis

Amazon

Person 97.3%
Door 50.6%

Captions

Microsoft

a black and white photo of a store window 75.6%
a black and white photo of a store 75.5%
a black and white photo of a building 75.2%

Text analysis

Amazon

C