Human Generated Data

Title

Untitled (circus performer reading Time, Ringling Brothers circus train)

Date

1941, printed later

People

Artist: Joseph Janney Steinmetz, American, 1905 - 1985

Classification

Photographs

Credit Line

Harvard Art Museums/Fogg Museum, Transfer from the Carpenter Center for the Visual Arts, American Professional Photographers Collection, 4.2002.283

Copyright

© Estate of Joseph Janney Steinmetz

Machine Generated Data

Tags

Amazon
created on 2022-01-08

Human 96.8
Person 94.4
Musical Instrument 92.6
Musician 92.6
Guitarist 83.6
Leisure Activities 83.6
Guitar 83.6
Performer 83.6
Furniture 81.1
Apparel 79.1
Clothing 79.1
Neck 58.6
Shelf 57.2
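
Tags of this kind are characteristic of AWS Rekognition label detection. The following is a minimal sketch, assuming boto3 with configured AWS credentials, of how similar name/confidence pairs can be produced; the filename and the 55% cutoff are illustrative placeholders, not part of this record:

```python
# Minimal sketch: produce Rekognition-style label tags for a local image.
# Assumptions: AWS credentials are configured for boto3, and
# "steinmetz_circus.jpg" is a stand-in filename.
import boto3

def detect_label_tags(image_path: str, min_confidence: float = 55.0):
    client = boto3.client("rekognition")
    with open(image_path, "rb") as f:
        response = client.detect_labels(
            Image={"Bytes": f.read()},
            MinConfidence=min_confidence,
        )
    # Return (name, confidence) pairs, highest confidence first,
    # matching the ordering of the list above.
    return sorted(
        ((label["Name"], label["Confidence"]) for label in response["Labels"]),
        key=lambda pair: pair[1],
        reverse=True,
    )

if __name__ == "__main__":
    for name, confidence in detect_label_tags("steinmetz_circus.jpg"):
        print(f"{name} {confidence:.1f}")
```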

Imagga
created on 2022-01-08

musical instrument 61.1
accordion 57.3
keyboard instrument 45.9
wind instrument 38.2
person 33.1
adult 31.7
attractive 28
sexy 24.9
portrait 24.6
pretty 23.8
people 23.4
model 22.6
lady 21.9
fashion 21.9
bag 21
smile 20.7
black 19.7
studio 18.2
happy 18.2
sitting 18
hair 17.4
cute 17.2
human 16.5
women 15.8
face 15.6
dress 15.4
business 15.2
brunette 14.8
youth 14.5
smiling 14.5
man 13.4
one 13.4
style 13.4
lifestyle 13
casual 12.7
holding 12.4
clothes 12.2
laptop 12
male 11.5
computer 11.3
body 11.2
chair 11.1
work 11.1
professional 11
cheerful 10.6
device 10.5
office 10.4
communication 10.1
businesswoman 10
modern 9.8
posing 9.8
success 9.7
clothing 9.5
happiness 9.4
expression 9.4
child 9.3
20s 9.2
scholar 9.2
make 9.1
suit 9
worker 9
lovely 8.9
hat 8.8
looking 8.8
urban 8.7
hot 8.4
blond 8.3
makeup 8.2
teenager 8.2
technology 8.2
student 8.2
bass 8
job 8
interior 8
working 8
look 7.9
couple 7.8
boy 7.8
eyes 7.7
seductive 7.7
seat 7.6
career 7.6
smart 7.5
city 7.5
lips 7.4
intellectual 7.3
gorgeous 7.2
pose 7.2

Google
created on 2022-01-08

Black 89.5
Sleeve 87.2
Dress 83.5
Bag 74.7
Chair 74.1
Vintage clothing 71.5
Monochrome photography 69.8
Sitting 69.7
Pattern 69
Comfort 66.7
Room 66.2
Luggage and bags 65.9
Stock photography 65.2
Book 65.2
Office equipment 64.4
Lap 63
Font 61.8
Monochrome 59.3
Linens 55.5
Retro style 54.8

Microsoft
created on 2022-01-08

text 96.1
clothing 92.7
person 92.2
black and white 87.4
human face 86.1
book 58.4
drawing 50.2

Face analysis

AWS Rekognition

Age 41-49
Gender Male, 70.2%
Disgusted 94.8%
Angry 2.5%
Calm 0.9%
Happy 0.6%
Confused 0.5%
Sad 0.4%
Surprised 0.2%
Fear 0.2%
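
An age range, gender estimate, and per-emotion confidence breakdown like the one above can be read from Rekognition face detection. A minimal sketch follows, assuming boto3 with configured credentials; the image filename is a placeholder, not part of this record:

```python
# Minimal sketch: read age range, gender, and emotion confidences
# from Rekognition face detection. The filename is a placeholder.
import boto3

client = boto3.client("rekognition")
with open("steinmetz_circus.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    age = face["AgeRange"]
    print(f"Age {age['Low']}-{age['High']}")
    print(f"Gender {face['Gender']['Value']}, {face['Gender']['Confidence']:.1f}%")
    # Emotions come back with per-emotion confidences, as listed above.
    for emotion in sorted(face["Emotions"], key=lambda e: e["Confidence"], reverse=True):
        print(f"{emotion['Type'].capitalize()} {emotion['Confidence']:.1f}%")
```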

Microsoft Cognitive Services

Age 31
Gender Female

Google Vision

Surprise Very unlikely
Anger Very unlikely
Sorrow Very unlikely
Joy Very unlikely
Headwear Unlikely
Blurred Very unlikely

Feature analysis

Amazon

Person 94.4%

Captions

Microsoft

text 27.8%

Text analysis

Amazon

MILK-BONE
DOG
DOG BISCUIT
BISCUIT
MURDER
MILK
MILK BONE
BONE
igune
ligrer

Google

MILK-BONE
MILK-BONE