Human Generated Data

Title

Untitled (navy ship and tugboat, Vietnam)

Date

1967-68

People

Artist: Gordon W. Gahan, American 1945 - 1984

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Gift of Barbara Robinson, 2007.184.2.177.1

Machine Generated Data

Tags

Amazon
created on 2021-12-14

Head 99.8
Person 95.6
Human 95.6
Face 85.7
Monitor 63.1
Electronics 63.1
Display 63.1
Screen 63.1
Figurine 60.8
Portrait 60
Photography 60
Photo 60
LCD Screen 55.6

Clarifai
created on 2023-10-22

people 99.8
portrait 99.7
one 99.4
monochrome 99.2
adult 98.5
man 98.2
street 95
boy 93.5
art 93
model 92.6
self 92.3
son 92.1
music 88
face 86.3
woman 86.2
indoors 85.9
child 85
girl 84.5
actor 83.1
black and white 82.1

Imagga
created on 2021-12-14

man 43.7
portrait 36.9
male 32.9
person 32.4
adult 32.4
face 32
barbershop 29.7
people 24.6
mature 24.2
handsome 24.1
shop 23.5
one 21.7
looking 21.6
old 20.2
senior 19.7
expression 19.6
hair 18.2
mercantile establishment 17.8
attractive 17.5
serious 17.2
black 16.4
casual 16.1
guy 15.7
model 15.6
human 15
negative 14
happy 13.8
eyes 13.8
men 13.7
head 13.4
elderly 13.4
smile 12.8
place of business 12.7
gray 12.6
businessman 12.4
lifestyle 12.3
grandfather 12.3
age 11.4
sexy 11.2
alone 11
eye 10.7
sad 10.6
success 10.5
close 10.3
film 10.2
dark 10
confident 10
aged 10
fitness 9.9
hand 9.9
health 9.7
retired 9.7
retirement 9.6
body 9.6
closeup 9.4
smiling 9.4
emotion 9.2
healthy 8.8
masculine 8.8
look 8.8
depression 8.8
muscular 8.6
business 8.5
professional 8.4
fit 8.3
photographic paper 7.9
wrinkled 7.8
grandmother 7.8
confidence 7.7
cellular telephone 7.7
world 7.6
horizontal 7.5
outdoors 7.5
manager 7.5
beard 7.3
indoor 7.3
suit 7.3
hairdresser 7.3
macho 7.1
child 7.1
executive 7.1

Google
created on 2021-12-14

Microsoft
created on 2021-12-14

wall 98.9
human face 98.7
indoor 96.6
person 95.9
man 86.1
portrait 81.7
text 74.9

Color Analysis

Face analysis

Amazon

AWS Rekognition

Age 40-58
Gender Female, 53.7%
Calm 90.5%
Surprised 6.2%
Sad 1.4%
Confused 0.8%
Happy 0.4%
Angry 0.3%
Disgusted 0.2%
Fear 0.2%

Feature analysis

Amazon

Person 95.6%

Categories

Imagga

paintings art 51.6%
people portraits 27.6%
food drinks 17.6%
pets animals 2.2%