Human Generated Data

Title

Untitled (view of vestibule decorated with photographs and movie advertisements)

Date

c. 1920-1940, printed later

People

Artist: O. B. Porter Studio, American, active 1930s-1940s

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.11171

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Interior Design 100
Indoors 100
Human 93.4
Person 93.4
Musical Instrument 79.2
Musician 79.2
Leisure Activities 66.4
Electronics 66.2
Screen 66.2
Guitar 63.6
Room 63.2
Vehicle 60.2
Transportation 60.2
Train 60.2
Display 59.1
Monitor 59.1
Theater 56.8

Clarifai
created on 2019-11-16

people 99.6
monochrome 98
movie 97.8
vehicle 96.4
room 96.2
man 96.1
indoors 95.5
adult 95.2
group 94.7
one 94.7
furniture 92.4
two 91.3
street 90.2
television 88.3
analogue 87.9
retro 86.3
military 86.3
war 86.2
wear 85.1
three 84.9

Imagga
created on 2019-11-16

door 26.1
white goods 25.8
home appliance 24.9
old 22.3
window 21.1
building 20.9
dishwasher 20
wall 19.7
architecture 19.5
sliding door 19.1
shop 17.2
appliance 16.6
house 15
barbershop 14.8
light 14.7
interior 14.1
windows 13.4
device 13.3
refrigerator 12.9
home 12.8
equipment 12.7
city 12.5
metal 12.1
empty 12
movable barrier 11.7
vintage 11.6
glass 10.9
aged 10.8
dirty 10.8
structure 10.8
mercantile establishment 10.7
urban 10.5
texture 10.4
brick 10.3
grunge 10.2
design 10.1
wood 10
barrier 9.2
retro 9
history 8.9
sky 8.9
television 8.8
room 8.5
travel 8.4
frame 8.3
durables 8.3
street 8.3
inside 8.3
sign 8.3
historic 8.2
elevator 8.2
pattern 7.5
dark 7.5
machine 7.5
blackboard 7.3
kitchen appliance 7.3
detail 7.2
microwave 7.2
place of business 7.1
steel 7.1
electronic equipment 7

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 99.6
indoor 91.8
black and white 90.6
person 90.6
white 89.9
clothing 81.9
black 72.4
open 59.9
man 58.3
concert 58.2

Face analysis

Amazon

AWS Rekognition

Age 34-50
Gender Male, 51.8%
Happy 45.4%
Confused 45.3%
Calm 49%
Angry 46.6%
Fear 45.1%
Disgusted 45.2%
Sad 48.3%
Surprised 45.1%

AWS Rekognition

Age 22-34
Gender Male, 54.3%
Fear 45%
Disgusted 45.2%
Calm 48.5%
Angry 45.1%
Sad 45.1%
Surprised 45.1%
Happy 50.8%
Confused 45.2%

AWS Rekognition

Age 36-52
Gender Male, 50.4%
Fear 49.6%
Calm 49.9%
Angry 49.6%
Happy 49.8%
Confused 49.5%
Disgusted 49.5%
Sad 49.5%
Surprised 49.5%

AWS Rekognition

Age 29-45
Gender Female, 54.8%
Fear 45.4%
Calm 45.1%
Surprised 45.1%
Confused 45.1%
Disgusted 45.1%
Happy 53.3%
Sad 45.8%
Angry 45.1%

Feature analysis

Amazon

Person 93.4%
Train 60.2%

Captions

Microsoft

a black and white photo of a microwave 35%
a black and white photo of a store 34.9%
a black and white photo of a store window 34.8%

Text analysis

Amazon

HEAR
HARMON
HEAR SING
SING
lose HARMON
lose

Google

HEAR ONIS LOSE HARMON WHITE
HEAR
WHITE
HARMON
ONIS
LOSE