Human Generated Data

Title

Untitled (man looking at dollar bill mold with magnifying glass)

Date

c. 1950

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.7046

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2021-12-15

Human 94.9
Person 94.9
Finger 87.4
Postage Stamp 68.3
Accessory 67.6
Glasses 67.6
Accessories 67.6
Face 63.2
Portrait 61
Photography 61
Photo 61

Imagga
created on 2021-12-15

negative 63.3
film 53.2
photographic paper 38.9
person 33.1
man 30.3
portrait 28.5
people 26.8
photographic equipment 25.9
adult 23.5
looking 23.2
face 22
male 22
senior 19.7
hospital 18.8
elderly 18.2
human 18
hair 17.4
doctor 16.9
medical 16.8
lifestyle 16.6
smile 16.4
smiling 15.9
mature 15.8
happy 15.7
professional 14.8
old 14.6
health 14.6
care 14
grandfather 13.7
model 13.2
holding 13.2
work 12.6
family 12.5
medicine 12.3
indoors 12.3
couple 12.2
love 11.8
eyes 11.2
men 11.2
women 11.1
thoughtful 10.7
laboratory 10.6
exam 10.5
one 10.5
home 10.4
cute 10.1
girls 10
bride 9.9
necktie 9.8
cheerful 9.8
lady 9.7
child 9.7
older 9.7
bow tie 9.7
pensioner 9.7
together 9.6
coat 9.6
husband 9.5
illness 9.5
bed 9.5
wife 9.5
females 9.5
healthy 9.5
lab coat 9.4
glasses 9.3
head 9.2
patient 9.2
groom 9.1
hand 9.1
modern 9.1
pretty 9.1
aged 9.1
clothing 8.9
businessman 8.8
scientist 8.8
look 8.8
lab 8.8
expertise 8.7
retired 8.7
sitting 8.6
serious 8.6
blond 8.5
business 8.5
lying 8.5
attractive 8.4
joy 8.4
occupation 8.3
life 8.2
baby 8.1
romantic 8
nurse 8
lovely 8
copy 8
happiness 7.8
education 7.8
bedroom 7.7
test 7.7
retirement 7.7
one person 7.5
closeup 7.4
worker 7.2
kid 7.1

Google
created on 2021-12-15

Microsoft
created on 2021-12-15

text 99.3
human face 88.8
black and white 82.6
drawing 78.7
sketch 72.5
person 67.1
glasses 61.3
clothing 54.9

Face analysis

Amazon

Google

AWS Rekognition

Age 20-32
Gender Female, 88.5%
Happy 58.1%
Calm 27.7%
Fear 7.1%
Surprised 2.7%
Sad 2.3%
Angry 1.8%
Disgusted 0.2%
Confused 0.2%

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Very unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.9%
Glasses 67.6%

Captions

Microsoft

text 97.6%

Text analysis

Amazon

3.
3. SHAMHART.
SHAMHART.
JSV