Human Generated Data

Title

Untitled (photographer cutting and hanging film)

Date

1965-1968

People

Artist: Gordon W. Gahan, American, 1945-1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.619.5

Machine Generated Data

Tags

Amazon
created on 2023-10-24

Architecture 99.9
Building 99.9
Hospital 99.9
Dressing Room 98.9
Indoors 98.9
Room 98.9
Adult 98.7
Male 98.7
Man 98.7
Person 98.7
Photography 96
Face 91
Head 91
Portrait 82.3
Clinic 77.9
Furniture 66.9
Table 57.5
Operating Theatre 56.2
Photographic Film 55.9
Dining Table 55.4
Desk 55.3
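
The labels above are credited to Amazon (Rekognition). As a minimal sketch of how such label data could be produced, assuming the boto3 SDK, a hypothetical local scan of this photograph, and thresholds loosely matching the scores shown (none of these values come from the museum's actual pipeline):

    # Hypothetical example: detect labels for a local scan of this photograph.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.2.619.5.jpg", "rb") as f:  # assumed local file name
        image_bytes = f.read()

    response = rekognition.detect_labels(
        Image={"Bytes": image_bytes},
        MaxLabels=25,       # cap roughly matching the length of the list above
        MinConfidence=55,   # lowest confidence shown above is about 55
    )

    for label in response["Labels"]:
        print(f"{label['Name']} {label['Confidence']:.1f}")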

Clarifai
created on 2018-10-05

people 98.8
adult 96.5
man 93.3
industry 90.3
one 89.3
science 86.8
indoors 86.7
technology 85.8
monochrome 83.8
woman 81
room 80.7
equipment 80.4
scientist 78.7
wear 78.6
vehicle 76.3
machine 76.3
research 75.5
business 73.6
vertical 73.3
rack 72.5

Imagga
created on 2018-10-05

bass 46.8
architecture 21.8
guitar 20.5
building 19.1
drawing 16.6
music 16.2
sketch 16.1
city 15.8
musical 14.4
structure 13.9
stringed instrument 13.4
tower 13.4
equipment 13.2
industry 12.8
black 12.6
instrument 12.4
steel 12.4
urban 12.2
sky 12.1
construction 12
technology 11.9
old 11.8
electric guitar 11.3
rock 11.3
metal 11.3
cable 11.2
musical instrument 10.7
sound 10.3
electric 10.3
wire 10.2
film 10.1
modern 9.8
jazz 9.8
negative 9.7
symbol 9.4
light 9.3
window 9.3
business 9.1
band 8.7
concert 8.7
power 8.4
house 8.4
device 8.3
street 8.3
retro 8.2
industrial 8.2
style 8.2
new 8.1
melody 7.8
musician 7.8
representation 7.8
play 7.7
performance 7.7
frame 7.5
vintage 7.4
man 7.4
exterior 7.4
design 7.3
bowed stringed instrument 7.2

Google
created on 2018-10-05

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 54-64
Gender Male, 86.9%
Calm 96%
Surprised 6.6%
Fear 6%
Disgusted 2.5%
Sad 2.2%
Angry 0.1%
Confused 0.1%
Happy 0.1%
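
The age range, gender, and emotion scores above are the kind of output AWS Rekognition's face detection returns. A hedged sketch, assuming the same hypothetical local file as above:

    # Hypothetical example: face attributes (age range, gender, emotions).
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.2.619.5.jpg", "rb") as f:  # assumed local file name
        image_bytes = f.read()

    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],  # request age, gender, and emotion estimates
    )

    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        print(f"Age {age['Low']}-{age['High']}")
        print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
        for emotion in face["Emotions"]:
            print(f"{emotion['Type'].title()} {emotion['Confidence']:.1f}%")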

Feature analysis

Amazon

Adult 98.7%
Male 98.7%
Man 98.7%
Person 98.7%

Captions

Microsoft
created on 2018-10-05

a group of people in a room 76.2%
a group of men in a room 64.7%
an old photo of a person 49%

Text analysis

Amazon

wajda
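
The single token above ("wajda") is the kind of result AWS Rekognition text detection returns for writing visible in the image. A minimal sketch under the same assumptions as the earlier examples:

    # Hypothetical example: detect text visible in the photograph.
    import boto3

    rekognition = boto3.client("rekognition", region_name="us-east-1")

    with open("2007.184.2.619.5.jpg", "rb") as f:  # assumed local file name
        image_bytes = f.read()

    response = rekognition.detect_text(Image={"Bytes": image_bytes})

    for detection in response["TextDetections"]:
        if detection["Type"] == "WORD":
            print(detection["DetectedText"], round(detection["Confidence"], 1))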