Human Generated Data

Title

Untitled (employee holding bread baking pans)

Date

c. 1938, printed later

People

Artist: Durette Studio, American 20th century

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.5757

Machine Generated Data

Tags

Amazon
created on 2019-11-16

Human 99.4
Person 99.4
Silhouette 71.8
Screen 61.5
Electronics 61.5
Leisure Activities 60.2
Flooring 59.6
Lab 57.6
Musical Instrument 56.9
Musician 56.9
Monitor 56.2
Display 56.2
LCD Screen 56.2
Standing 55.6
Crowd 55.1
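
The label/score pairs above follow the shape returned by Amazon Rekognition's DetectLabels operation: a label name plus a 0-100 confidence score. The following is a minimal sketch of how comparable tags could be produced with boto3; the local filename is a placeholder, not the museum's actual source image, and AWS credentials are assumed to be configured.

import boto3

rekognition = boto3.client("rekognition")

# "photo.jpg" is a placeholder path for the digitized photograph.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_labels(
        Image={"Bytes": f.read()},
        MaxLabels=20,       # cap on the number of labels returned
        MinConfidence=50,   # drop labels scored below 50
    )

# Print each label with one decimal place, matching the list above.
for label in response["Labels"]:
    print(f"{label['Name']} {label['Confidence']:.1f}")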

Clarifai
created on 2019-11-16

people 99.3
monochrome 98.1
music 98
man 95.8
street 90.7
analogue 90.7
one 88.5
adult 88.1
instrument 87.7
drummer 84.7
band 84.2
musician 84
drum 83.3
silhouette 82.2
art 81.8
woman 80.9
jazz 80.7
room 78.7
light 78.2
group 78.1

Imagga
created on 2019-11-16

musical instrument 93.3
electronic instrument 68
violin 42.7
stringed instrument 42.3
device 38.7
bowed stringed instrument 37
interior 27.4
modern 23.1
home 23.1
percussion instrument 22
music 21.6
room 21.3
chair 21.2
house 20
table 18.2
male 17
kitchen 16.2
man 15.4
indoors 14.9
person 14.3
instrument 14
indoor 13.7
musical 13.4
people 13.4
floor 13
piano 12.9
inside 12.9
furniture 12.7
concert 12.6
architecture 12.5
design 12.4
upright 12.3
office 12
window 11.9
style 11.9
wood 11.7
business 11.5
guitar 11.5
black 11.4
glass 10.9
lifestyle 10.8
light 10.7
decor 10.6
apartment 10.5
bass 10
equipment 9.9
musician 9.8
adult 9.7
men 9.4
silhouette 9.1
stove 9.1
cabinet 8.9
wooden 8.8
cooking 8.7
rock 8.7
work 8.6
play 8.6
desk 8.6
domestic 8.1
keyboard instrument 8.1
art 7.8
vibraphone 7.8
band 7.8
classical 7.6
player 7.5
group 7.2
board 7.2
building 7.1

Google
created on 2019-11-16

Microsoft
created on 2019-11-16

text 96.7
indoor 87.8
black and white 86.4
person 81.2
music 56.5
piano 53.2
kitchen appliance 12.5

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 18-30
Gender Male, 53.7%
Angry 45.7%
Fear 45.1%
Calm 48.9%
Sad 46.4%
Disgusted 48.6%
Happy 45.1%
Confused 45.1%
Surprised 45.1%
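
The age range, gender, and per-emotion confidences above match the structure of Amazon Rekognition's DetectFaces response (AgeRange, Gender, and an Emotions list). A hedged sketch of such a call, again assuming a placeholder local file and configured AWS credentials:

import boto3

rekognition = boto3.client("rekognition")

# "photo.jpg" is a placeholder path, not the museum's source file.
with open("photo.jpg", "rb") as f:
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request age range, gender, emotions, etc.
    )

# Report each detected face in the same layout as the section above.
for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    for emotion in face["Emotions"]:
        print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")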

Feature analysis

Amazon

Person 99.4%

Categories