Human Generated Data

Title

Untitled (photographer examining film)

Date

1965-1968

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.623.4

Machine Generated Data

Tags

Amazon
created on 2023-10-24

Adult 99.4
Male 99.4
Man 99.4
Person 99.4
Photography 93.3
Indoors 93.2
Face 79.5
Head 79.5
Electronics 69.3
Dressing Room 57
Room 57
Architecture 56.7
Building 56.7
Furniture 56.7
Living Room 56.7
Lighting 56.6
Electrical Device 56
Microphone 56
Bed 55.3
Bedroom 55.3
Hospital 55.3
Portrait 55.3

Clarifai
created on 2018-10-05

people 99.6
adult 98.4
indoors 97.9
room 97.2
one 95.8
man 95.6
window 94.2
wear 92.5
woman 90.9
monochrome 90.9
two 85.1
veil 82.2
furniture 81.8
mirror 81.1
portrait 80.3
family 77.3
light 77.1
door 75.5
house 75.5
home 74.5

Imagga
created on 2018-10-05

brass 73.1
cornet 61.7
wind instrument 58.1
musical instrument 44.7
man 33.6
male 29.8
horn 26.2
device 24.2
person 20.8
adult 18.2
people 16.2
men 15.5
instrumentality 14.7
portrait 14.2
trombone 13.3
playing 12.8
building 12.7
happy 12.5
businessman 12.4
fashion 12.1
violin 11.6
lifestyle 11.6
business 11.5
urban 11.4
human 11.2
outdoors 11.2
dress 10.8
city 10.8
suit 10.8
wall 10.3
black 10.2
elegance 10.1
window 10.1
sax 10.1
handsome 9.8
attractive 9.8
artifact 9.8
stringed instrument 9.5
bowed stringed instrument 9.5
professional 9.5
corporate 9.4
casual 9.3
shower 9.2
hair 8.7
couple 8.7
boy 8.7
model 8.6
modern 8.4
music 8.2
recreation 8.1
success 8
worker 8
posing 8
job 8
work 7.8
standing 7.8
play 7.8
old 7.7
guy 7.6
club 7.5
life 7.5
one 7.5
sport 7.4
exercise 7.3
clothing 7.2
musician 7.2

Google
created on 2018-10-05

Microsoft
created on 2018-10-05

wall 99.8
indoor 88.7
standing 77.2

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 33-41
Gender Female, 79%
Sad 91.9%
Calm 39.6%
Surprised 7.6%
Fear 6.6%
Happy 6.6%
Angry 2.4%
Disgusted 1.8%
Confused 1.7%

Feature analysis

Amazon

Adult 99.4%
Male 99.4%
Man 99.4%
Person 99.4%

Categories